Friday, December 28, 2012

The Power of Business Rules Management

There is growing interest in business rules and in business rules management systems (BRMSs), sometimes called business rules engines. Major software vendors, from IBM, Oracle, and SAP to Red Hat and SAS, have developed or are developing business rules management systems.

The use of these systems to support the practice of decision management to automate and manage high-volume, transactional decisions is growing rapidly. A new standard, called the Decision Model and Notation standard, is under development that will bring consistency of representation to the industry. Yet there is still a sense that this is a niche technology, and it is somewhat poorly understood outside of its traditional areas of strength. So what is a BRMS and how does it support decision management?

What is Decision Management?

First we need to discuss decision management. Decision management is a business approach that explicitly focuses on the management and automation of business decisions, especially the day-to-day operations that must be made to complete an operational process or handle a specific transaction. This approach brings together business rules and various analytic techniques and is widely used to effectively apply BRMSs. While there are many other things that you can do with business rules (e.g., improve data quality, manage user interfaces, etc.), the use of business rules to manage decisions is what makes a BRMS compelling.

If you want to know more about decision management, you can check out my columns, entitled "Building Agile, Analytic and Adaptive Systems" and "Four Principles of Decision Management Systems."

Business Rules Management Systems

A BRMS is a complete set of software components for the creation, testing, management, deployment and ongoing maintenance of business rules or decision logic in a production operational environment. These systems used to be, and sometimes still are, called business rule engines. However, not every BRMS uses a BRE (some generate code instead), and even when a BRMS includes a BRE, it is just one part of a complete system — an important part, but one that deals only with execution. A BRE determines which rules need to be executed in what order. A BRMS is concerned with a lot more, including:

  • The development and testing of the rules
  • Linking business rules to data sources
  • Deploying business rules to decision services in different computing environments
  • Identifying rule conflicts and quality issues
  • Enabling business user rule maintenance
  • Measuring and reporting rule effectiveness
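
To make the division of labor concrete, here is a minimal Python sketch of just the execution core a BRE provides: rules as prioritized condition/action pairs run against a set of facts. This is an illustrative toy, not any vendor's product; a full BRMS wraps such a core with the repository, authoring, testing and deployment capabilities listed above.

```python
# Toy sketch of the execution core a BRE provides (hypothetical, not a
# real product): rules are (name, priority, condition, action) tuples
# evaluated against a dict of facts. A BRMS adds everything around this.

def run_rules(facts, rules):
    """Fire each matching rule in priority order; return the fired rule names."""
    fired = []
    for name, priority, condition, action in sorted(rules, key=lambda r: r[1]):
        if condition(facts):
            action(facts)
            fired.append(name)
    return fired

rules = [
    ("high_value", 1, lambda f: f["claim_amount"] > 10_000,
     lambda f: f.update(route="adjuster")),
    ("default_fast_track", 2, lambda f: "route" not in f,
     lambda f: f.update(route="fast_track")),
]

facts = {"claim_amount": 12_500}
print(run_rules(facts, rules), facts["route"])  # → ['high_value'] adjuster
```

The point of the sketch is only that execution order and matching are the BRE's job; versioning, auditing and business-user editing of the `rules` list are what the rest of the BRMS exists to provide.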

To deliver on all this, a BRMS needs a robust set of capabilities for managing decision logic, such as those documented in my Decision Management Systems Platform Technologies Report and shown in Figure 1, The Elements of Decision Logic Management Supported by a Typical BRMS. Specifically, you need:

  • An enterprise-class rule repository with audit trails and versioning.
  • Technical rule management tools that allow technical users (e.g., developers and architects) to integrate business rules with the rest of the environment and edit/manage technical rules.
  • Non-technical rule management tools that allow business analysts and even business users to routinely change and manage business rules (see below).
  • Verification and validation tools, usable by both technical and business users, that take advantage of the nature of business rules to make sure they are correct and complete.
  • Testing and debugging tools to confirm that you get the decisions you were expecting.
  • Deployment tools supporting multiple platforms that allow the logic you have specified to be deployed into a decision service (see below).
  • Data management capabilities to bring real enterprise data into the environment so rules can be written against it.
  • Impact analysis tools so that a user can see what the impact of a change will be before he or she makes it.
  • Either a high-performance BRE to which the rules can be deployed or an ability to generate code that can be deployed.
  • An ability to support the logging of rule execution, so you can tell exactly how a particular decision was made and which rules were executed.

It should be noted that a modern BRMS is likely to support the management of rules derived from optimization and analytic tools as well as rules specified explicitly by a user.

Decision Services

Decision services are the link between a BRMS and decision management: they are where a BRMS's focus on managing decisions pays off. These are sometimes called transparent decision services, agile decision services or even decision agents. Decision services are the key technical deliverable from the combination of a BRMS and the decision management approach. A decision service is a self-contained, callable service or agent with a view of all the information, conditions and actions that need to be considered to make an operational business decision. Deployed on a service-oriented infrastructure and available to other services, to service-enabled applications and to business processes managed using a business process management system, decision services package up all the rules (and any analytics) that go into making a decision.

Decisions can often be thought of in terms of a question for which there is a known, allowed set of answers. For instance, a decision about routing an insurance claim might be thought of as the question, “How should we handle this claim?” Allowed answers include auto pay, fast track, refer to a claims adjuster or refer to the fraud investigation group. A decision service handling claims routing would take the information about the claim and then return one of the allowed answers to a calling process or service. In other words, a decision service answers a business question for other services.
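
The claims-routing example can be sketched as a tiny decision service: a callable that takes claim data and always returns one of the allowed answers. The field names and thresholds here are invented for illustration.

```python
# Hypothetical claims-routing decision service: one business question in,
# one of a fixed set of allowed answers out. Fields and thresholds are
# invented for illustration, not taken from any real system.

ALLOWED_ANSWERS = {"auto_pay", "fast_track", "adjuster", "fraud_investigation"}

def route_claim(claim: dict) -> str:
    """Answer the question: how should we handle this claim?"""
    if claim.get("fraud_score", 0) > 0.8:
        answer = "fraud_investigation"
    elif claim["amount"] > 25_000:
        answer = "adjuster"
    elif claim["amount"] < 500:
        answer = "auto_pay"
    else:
        answer = "fast_track"
    assert answer in ALLOWED_ANSWERS  # the service never returns anything else
    return answer

print(route_claim({"amount": 300}))    # → auto_pay
print(route_claim({"amount": 9_000}))  # → fast_track
```

The design point is the closed answer set: a calling process only ever has to handle the four allowed routes, no matter how the decision logic inside evolves.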


Wednesday, December 12, 2012

Here's a Better Way to Remember Things

A group of Brazilian entrepreneurs who have come north for a week's worth of ideas on growing their ventures are leaving a class when one of them breaks from the pack toward the coffee maker, where I'm heading too. He works the machine first, reciting something again and again in Portuguese as he watches his cup fill.

"Excuse me?" I say, unsure he's talking to me.

"Sorry, I am repeating what the lecturer said," he explains, "so I remember later."

Remembering new information is an underappreciated skill; the fact that most of us have never evolved our technique beyond the rudimentary, ad hoc approaches we used as middle schoolers suggests as much. Yet it is essential to any sort of professional growth, since the need to learn is constant, and it can separate exceptional performances from mediocre ones. After all, would you prefer to hire the consultant who presented from cue cards or the one who pitched from memory?

Fortunately for us, insights from cognitive psychology have vastly improved our understanding of how we remember. Many of these are accepted wisdom in the neurological and psychological realms. But it hasn't been easy to transfer that knowledge to actual tools for individuals. Until recently, anyway. Easy-to-use auto-analytic tools that exploit our understanding of memory can now help you treat remembering as the skill it is, and improve it the same way you improve any professional skill, like public speaking. Here's how to get started.

First, focus on the right unit of measure. Yes, your objective is to remember better, but you'll get the best results by focusing on forgetting as your base unit of analysis.

Experimental psychologist Hermann Ebbinghaus's pioneering discovery of the forgetting curve shows that we forget the majority of newly learned information within hours or days, unless we review it again and again. This alone won't be a shock to many of us. But Ebbinghaus demonstrated how systematic forgetting is: it occurs exponentially along a predictable curve, in what researchers call "exponential decay."


Different things you're trying to remember will have different curves. For instance, that piece of operations data that you remember clearly, since you prepped and presented it to your team, has a flatter downward curve (you'll remember it longer) than the now-hazy sales figure a colleague mentioned during the same team meeting. Even so, each curve is predictable.
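
The curve itself is commonly modeled as exponential decay, R = e^(-t/S), where t is time since learning and S is a stability that grows with rehearsal. A quick sketch (the stability values are made up) shows why the figure you presented outlasts the one you half heard:

```python
import math

def retention(t_days, stability):
    """Exponential-decay model of the forgetting curve: R = e^(-t/S)."""
    return math.exp(-t_days / stability)

# Made-up stabilities: the figure you presented vs. one you overheard.
presented, overheard = 10.0, 2.0
for day in (1, 3, 7):
    print(day, round(retention(day, presented), 2),
          round(retention(day, overheard), 2))
```

With these illustrative numbers, a week out you retain about half of the well-rehearsed figure but only a few percent of the overheard one, even though both decay along the same kind of curve.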

Practice remembering at the right time. Think about how you really use your memory for things that matter to you and your career, like in preparing for a speech. Maybe you're a crammer who tries to prime your memory by doing as many dry-runs as possible the night before. Or perhaps you've committed to ploddingly rehearsing your lines each afternoon for a month from 3 pm to 4 pm. Or maybe you're an improviser who finds time here and there, rehearsing what you'll say at random moments between meetings.

The forgetting curve suggests you should follow a very different memorization process than any of these entail. It shows that there's a precise moment that's best for practicing your lines. That moment is just before you are about to forget them.

So sessions aimed at learning new content should happen at "about-to-forget" moments, with spaces between practice sessions increasing as you approach mastery. This learning process is called spaced repetition, and can help us avoid the inefficiencies and risks of ad hoc memorization methods like cramming.

Incorporate auto-analytics tools. OK, so you get the idea that you should try to commit things to memory only when you are just about to forget them. But how do you know when that critical moment is about to happen? How do you know what your forgetting curve looks like?

Almost like a fingerprint, your forgetting curve is different from anyone else's. But a type of auto-analytics tool called "spaced repetition software," or "SRS," can learn the idiosyncrasies of your memory and then ping you to practice at the optimal time.

These mobile and desktop tools are like automated flashcards, though you work through your "pile" according to your personal algorithm and the rules of spaced repetition.

They fine-tune your algorithm using a straightforward rating system. Let's say you're a newly appointed manager learning some finance for the first time, and you're trying to improve your recall of many new terms. When the term "Leverage" appears you recall its meaning effortlessly and assign it an A. But when "Arbitrage" appears you assign it a D since you must labor to recall its basic meaning, and even then it remains fuzzy.

The tool continually hones its prompts based on your input. No doubt you'll see "Arbitrage" sooner than "Leverage," as practice sessions for "Leverage" would be scheduled later and less frequently to maximize efficient memorization.
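
That scheduling behavior can be sketched as a toy interval calculator, loosely in the spirit of spaced-repetition algorithms such as SM-2; the grade multipliers here are invented. Good grades stretch the next review interval, while a failing grade resets it:

```python
# Toy spaced-repetition scheduler, loosely SM-2-flavored.
# The multipliers are invented for illustration.
# Grade A = effortless recall, D = failed or very labored recall.

GRADE_FACTOR = {"A": 2.5, "B": 2.0, "C": 1.3, "D": 0.0}

def next_interval(current_days: float, grade: str) -> float:
    """Days until the next review: stretch on success, reset on failure."""
    factor = GRADE_FACTOR[grade]
    if factor == 0.0:
        return 1.0                      # forgot it: start over tomorrow
    return max(1.0, current_days * factor)

interval = 1.0
for grade in ["A", "A", "D", "B"]:      # e.g. "Arbitrage" over four sessions
    interval = next_interval(interval, grade)
    print(grade, interval)              # intervals: 2.5, 6.25, 1.0, 2.0
```

The widening gaps between successful reviews are the "spaces" in spaced repetition: each review lands near the about-to-forget moment, when practice buys the most retention per minute.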

Map your practice to your priorities. Finally, be very selective when choosing what you want to get better at remembering. In theory, you could work on mastering numerous new domains at once, but experimental research and case studies suggest this isn't practical for full-time workers.

Focus instead on a single development opportunity integral to your career. (See the accompanying chart for examples of cases where you could use SRS.) Does this opportunity require learning new terms, concepts, or narratives? If yes, then it makes sense to focus on hacking your memory with these computing tools to pursue it.

In short, when you're on a steep learning curve, remember the forgetting curve, and then beat it.


Thursday, December 6, 2012

What a Big-Data Business Model Looks Like

The rise of big data is an exciting — if in some cases scary — development for business. Together with the complementary technology forces of social, mobile, the cloud, and unified communications, big data brings countless new opportunities for learning about customers and their wants and needs. It also brings the potential for disruption, and realignment. Organizations that truly embrace big data can create new opportunities for strategic differentiation in this era of engagement. Those that don't fully engage, or that misunderstand the opportunities, can lose out.

There are a number of new business models emerging in the big data world. In my research, I see three main approaches standing out. The first focuses on using data to create differentiated offerings. The second involves brokering this information. The third is about building networks to deliver data where it's needed, when it's needed.

Differentiation creates new experiences. For a decade or so now, we've seen technology and data bring new levels of personalization and relevance. Google's AdSense delivers advertising that's actually related to what users are looking for. Online retailers are able to offer — via FedEx, UPS, and even the U.S. Postal Service — up-to-the-minute tracking of where your packages are. Map services from Google, Microsoft, Yahoo!, and now Apple provide information linked to where you are.

Big data offers opportunities for many more service offerings that will improve customer satisfaction and provide contextual relevance. Imagine package tracking that allows you to change the delivery address as you head from home to office. Or map-based services that link your fuel supply to availability of fueling stations. If you were low on fuel and your car spoke to your maps app, you could not only find the nearest open gas stations within a 10-mile radius, but also receive the price per gallon. I'd personally pay a few dollars a month for a contextual service that delivers the peace of mind of never running out of fuel on the road.

Brokering augments the value of information. Companies such as Bloomberg, Experian, and Dun & Bradstreet already sell raw information, provide benchmarking services, and deliver analysis and insights from structured data sources. In a big data world, though, these proprietary systems may struggle to keep up. Opportunities will arise for new forms of information brokering and new types of brokers that address new unstructured, often open data sources such as social media, chat streams, and video. Organizations will mash up data to create new revenue streams.

The permutations of available data will explode, leading to sub-sub specialized streams that can tell you the number of left-handed Toyota drivers who drink four cups of coffee every day but are vegan and seek a car wash during their lunch break. New players will emerge to bring these insights together and repackage them to provide relevancy and context.

For example, retailers like Amazon could sell raw information on the hottest purchase categories. Additional data on weather patterns and payment volumes from other partners could help suppliers pinpoint demand signals even more closely. These new analysis and insight streams could be created and maintained by information brokers who could sort by age, location, interest, and other categories. With endless permutations, brokers' business models would align by industries, geographies, and user roles.

Delivery networks enable the monetization of data. To be truly valuable, all this information has to be delivered into the hands of those who can use it, when they can use it. Content creators — the information providers and brokers — will seek placement and distribution in as many ways as possible.

This means, first, ample opportunities for the arms dealers — the suppliers of the technologies that make all this gathering and exchange of data possible. It also suggests a role for new marketplaces that facilitate the spot trading of insight, and deal room services that allow for private information brokering.

The most intriguing opportunities, though, may be in the creation of delivery networks where information is aggregated, exchanged, and reconstituted into newer and cleaner insight streams. Similar to the cable TV model for content delivery, these delivery networks will be the essential funnel through which information-based offerings will find their markets and be monetized.

Few organizations will have the capital to create end-to-end content delivery networks that can go from cloud to devices. Today, Amazon, Apple, Bloomberg, Google, and Microsoft show such potential, as they own the distribution chain from cloud to device and some starter content. Telecom giants such as AT&T, Verizon, Comcast, and BT have an opportunity to provide infrastructure as well; however, we haven't yet seen significant movement beyond voice and data services. Big data could be their opportunity.

Meanwhile, content creators — the information providers and brokers — will likely seek placement and distribution in as many delivery networks as possible. Content relevancy will emerge as a strategic competency for delivering offers in ad networks based on context: role, relationship, product ownership, location, time, sentiment, and even intent. For example, large wireless carriers can map traffic flows down to the cell tower. Using this data, carriers could work with display advertisers to optimize advertising rates for the most popular routes on football game days based on digital foot traffic.

There are many possible paths to monetize the big data revolution ahead. What's crucial is to have an idea of which one you want to follow. Only by understanding which business model (or models) suits your organization best can you make smart decisions on how to build, partner, or acquire your way into the next wave.

[Figure: Big Data Business Models]


Saturday, October 6, 2012

To Succeed with Big Data, Start Small

While it isn't hard to argue the value of analyzing big data, it is intimidating to figure out what to do first. There are many unknowns when working with data that your organization has never used before — the streams of unstructured information from the web, for example. Which elements of the data hold value? What are the most important metrics the data can generate? What quality issues exist? As a result of these unknowns, the costs and time required to achieve success can be hard to estimate.

As an organization gains experience with specific types of data, certain issues will fade, but there will always be another new data source with the same unknowns waiting in the wings. The key to success is to start small. It's a lower-risk way to see what big data can do for your firm and to test your firm's readiness to use it.

The Traditional Way

In most organizations, big data projects get their start when an executive becomes convinced that the company is missing out on opportunities in data. Perhaps it's the CMO looking to glean new insight into customer behavior from web data, for example. That conviction leads to an exhaustive and time-consuming process by which the CMO's team might work with the CIO's team to specify and scope the precise insights to be pursued and the associated analytics to get them.

Next, the organization launches a major IT project. The CIO's team designs and implements complex processes to capture all the raw web data needed and transform it into usable (structured) information that can then be analyzed.

Once analytic professionals start using the data, they'll find problems with the approach. This triggers another iteration of the IT project. Repeat a few times and everyone will be pulling their hair out and questioning why they ever decided to try to analyze the web data in the first place. This is a scenario I have seen play out many times in many organizations.

A Better Approach

The process I just described doesn't work for big data initiatives because it's designed for cases where all the facts are known, all the risks are identified, and all steps are clear — exactly what you won't find with a big data initiative. After all, you're applying a new data source to new problems in a new way.

Again, my best advice is to start small. First, define a few relatively simple analytics that won't take much time or data to run. For example, an online retailer might start by identifying which products each customer viewed so that the company can send a follow-up offer if the customer doesn't purchase. A few intuitive examples like this allow the organization to see what the data can do. More importantly, this approach yields results that are easy to test to see what type of lift the analytics provide.
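
That starter analytic is little more than a set difference over two event logs. A minimal sketch in Python, with invented customer and product names:

```python
# Hypothetical sketch of the "viewed but didn't purchase" starter analytic.
# Event logs are (customer_id, product_id) pairs; all names are invented.

views = {("c1", "tent"), ("c1", "stove"), ("c2", "boots")}
purchases = {("c1", "stove")}

def follow_up_offers(views, purchases):
    """Customer/product pairs viewed but never bought: offer candidates."""
    return sorted(views - purchases)

print(follow_up_offers(views, purchases))  # → [('c1', 'tent'), ('c2', 'boots')]
```

Even an analytic this simple is testable: send the offer to half the candidates, hold the other half back, and compare conversion, which is exactly the kind of lift measurement the prototype is meant to produce.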

Next, instead of setting up formal processes to capture, process, and analyze all of the data all of the time, capture some of the data in a one-off fashion. Perhaps a month's worth for one division for a certain subset of products. If you capture only the data you need to perform the test, you'll find the initial data volume easier to manage and you won't muddy the water with a bunch of other data — a problem that plagues many big data initiatives.

At this point, it is time to turn analytic professionals loose on the data. Remember: they're used to dealing with raw data in an unfriendly format. They can zero in on what they need and ignore the rest. They can create test and control groups to whom they can send the follow-up offers, and then they can help analyze the results. During this process, they'll also learn an awful lot about the data and how to make use of it. This kind of targeted prototyping is invaluable when it comes to identifying trouble and firming up a broader effort.

Successful prototypes also make it far easier to get the support required for the larger effort. Best of all, the full effort will now be less risky because the data is better understood and the value is already partially proven. It's also worthwhile to learn early if the initial analytics aren't as valuable as hoped: that tells you to focus effort elsewhere before you've wasted many months and a lot of money.

Pursuing big data with small, targeted steps can actually be the fastest, least expensive, and most effective way to go. It enables an organization to prove there's value in major investment before making it and to understand better how to make a big data program pay off for the long term.


How to Present to Senior Executives

Senior executives are one of the toughest crowds you'll face as a presenter. They're incredibly impatient because their schedules are jam-packed — and they have to make lots of high-stakes decisions, often with little time to weigh options. So they won't sit still for a long presentation with a big reveal at the end. They'll just interrupt you before you finish your shtick.

It can be frustrating. You probably have a lot to say to them, and this might be your only shot to say it. But if you want them to hear you at all, get to what they care about right away so they can make their decisions more efficiently. Having presented to top executives in many fields — from jet engines to search engines — I've learned the hard way that if you ramble in front of them, you'll get a look that says, "Are you kidding me? You really think I have the time to care about that?" So quickly and clearly present information that's important to them, ask for questions, and then be done. If your spiel is short and insightful, you'll get their ear again.

Here's how you can earn their attention and support:

Summarize up front: Say you're given 30 minutes to present. When creating your intro, pretend your whole slot got cut to 5 minutes. This will force you to lead with all the information your audience really cares about — high-level findings, conclusions, recommendations, a call to action. State those points clearly and succinctly right at the start, and then move on to supporting data, subtleties, and material that's peripherally relevant.

Set expectations: Let the audience know you'll spend the first few minutes presenting your summary and the rest of the time on discussion. Even the most impatient executives will be more likely to let you get through your main points uninterrupted if they know they'll soon get to ask questions.

Create summary slides: When making your slide deck, place a short overview of key points at the front; the rest of your slides should serve as an appendix. Follow the 10% rule: If your appendix is 50 slides, create 5 summary slides, and so on. After you present the summary, let the group drive the conversation, and refer to appendix slides as relevant questions and comments come up. Often, executives will want to go deeper into certain points that will aid in their decision making. If they do, quickly pull up the slides that speak to those points.

Give them what they asked for: If you were invited to give an update about the flooding of your company's manufacturing plant in Indonesia, do so before covering anything else. This time-pressed group of senior managers invited you to speak because they felt you could supply a missing piece of information. So answer that specific request directly and quickly.

Rehearse: Before presenting, run your talk and your slides by a colleague who will serve as an honest coach. Try to find someone who's had success getting ideas adopted at the executive level. Ask for pointed feedback: Is your message coming through clearly and quickly? Do your summary slides boil everything down into skimmable key insights? Are you missing anything your audience is likely to expect?

Sounds like a lot of work? It is, but presenting to an executive team is a great honor and can open tremendous doors. If you nail this, people with a lot of influence will become strong advocates for your ideas.

This is the first post in Nancy Duarte's blog series on creating and delivering presentations, based on tips from her new book, the HBR Guide to Persuasive Presentations (October 2012).
