Fast Answers Through Embedded Business Intelligence
By Chris Trayhorn, Publisher of mThink Blue Book, May 14, 2007

Utilities have long been on a quest for better ways to handle and make sense of data. Few other industries need control over such huge data volumes simply to handle the everyday aspects of their business: locating assets, dealing with customers and their consumption, directing field crews, repairing outages and addressing ever-increasing requirements for efficiency, cost control and community contributions. It is not surprising, then, that utilities were prime candidates for the earliest forays into business intelligence.

Initial, Partial Success

In the 1980s, report writers dominated the scene. Though seemingly straightforward, they proved difficult to use. IT experts had to not only translate business requirements but also know precisely what data was stored, where it was stored, how the value was populated and what caused it to change. Coding was an IT art of the highest order, requiring specialized knowledge of proprietary tools that were time-intensive and laborious. Thus, information users could not write or modify reports and had to wait months for IT to make even high-priority changes. That essentially guaranteed that any information gleaned would be out of date.

Utility business intelligence made a major leap forward with the development of data warehouses and data marts. These hubs of mined information from varying data sources were generally separated from the production environment so as not to risk the integrity of live data or hinder the performance of production systems. They also permitted online analytical processing (OLAP): fast, interactive and, above all, multidimensional information access that facilitates analysis via presentations like the Balanced Scorecard. Data mining technology took us a step further in helping users to find patterns in large amounts of data.

As with report writing, however, data warehouse implementation generally required lengthy analysis, programming and setup: complex tasks with which few in-house IT experts could cope. Options and compromises were many and decisions hard to come by. Many companies became mired in analysis paralysis, and many projects never produced concrete results. As a consequence, most organizations attempting to implement data warehouses hired busloads of outside experts from the major consulting firms for months or even years at a time.

Even when experts implemented data warehouses, users still faced the same problem they had had with report writers: they could not make changes or ask today's most relevant questions. So, few of these early projects succeeded in attaining the business decision improvements hoped for. Some got partway to their goal, if only for a short time. Many declared victory and then sank beneath the waves of an annual report's footnotes. And data remained isolated within its own individual departmental silos.

New Approaches

In the 21st century, applications vendors have begun to build a new road to business intelligence. Its raw materials are:

- New delivery techniques, especially Web-based portals and the technology that lets users easily customize them.
- Pre-built extracts from common commercial packages such as ERP, CIS, etc., that require less specialized expertise.
- Aggregation, bit mapping and other OLAP optimization techniques built into standard database products, making them ubiquitous and easier to use (see the sketch after this list).
- Data warehousing techniques that support real-time or near-real-time data feeds, effectively supporting the analysis of rapidly changing operational data.
- Standards that allow simple interaction between the OLAP results and business processes on the front end, closing the loop between measuring results and improving processes.
- Perhaps most important, tools that bridge the gap between the business and IT departments, requiring less translation between roles and allowing each to work in their own specialization to get the job done.
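To make the idea of OLAP-style aggregation running directly on a standard database more concrete, here is a minimal sketch. The table and column names (a hypothetical outage_fact table with region, month and duration columns) are illustrative assumptions, not taken from any particular utility system or vendor product.

```python
import sqlite3

# Minimal sketch: an OLAP-style "slice and dice" aggregation run directly
# against a standard relational database. The schema below is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE outage_fact (
        region   TEXT,
        month    TEXT,   -- e.g. '2007-05'
        duration REAL    -- outage duration in minutes
    )
""")
conn.executemany(
    "INSERT INTO outage_fact VALUES (?, ?, ?)",
    [("North", "2007-04", 42.0), ("North", "2007-05", 35.5),
     ("South", "2007-04", 61.0), ("South", "2007-05", 58.0)],
)

# Two dimensions (region, month), one measure (average duration):
# the kind of multidimensional summary an OLAP engine pre-aggregates.
for region, month, avg_minutes in conn.execute("""
    SELECT region, month, AVG(duration)
    FROM outage_fact
    GROUP BY region, month
    ORDER BY region, month
"""):
    print(f"{region} {month}: {avg_minutes:.1f} min average outage")
```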
Results Out of the Box

The result of this initiative is embedded business intelligence: solid, quick and easy analytical capability across an enterprise. This new approach, generally designed with a specific industry in mind, automatically collects, stores and analyzes data within a live application environment, hence the common term for it, embedded BI. It permits users from the CEO down to get the data they need, presented in virtually any manner, on a regular basis or in response to on-the-fly queries. It draws together information from different databases and presents results in easily accessible, graphical forms. As a result, managers and staff at any level can perform the updates, historical comparisons and cross-organizational comparisons that enhance real-time decision making. And, though out of the box and easy to upgrade, these products still have the flexibility to permit sophisticated customization where required.

Hesitation

Embedded BI is not a complete replacement for the large, complex BI projects custom-built from vendor-provided tools, which can:

- Draw data from a larger variety of sources;
- Respond to an organization's specific needs; and
- Use vocabulary and other approaches unique to the organization.

But the cost of these customized solutions is very high, easily an order of magnitude higher than the typical cost of embedded approaches that are linked to a specific application. And, over time, even the simplest and most straightforward custom applications tend to become burdened with modifications to accommodate the demands of individual users. These changes may seem logical, particularly those that minimize the time users must work with the application before getting their results, but as they accumulate, they lower system performance and degrade overall processing. The outcome is frequently the BI "Franken-App": complex solutions that become increasingly difficult to maintain and change over time.

Why Choose Embedded BI?

Embedded business intelligence is different. Linked to a specific application or set of applications, it provides such benefits as:

- Significantly lower cost.
- The ability to tailor applications to specific needs and to retain those changes over time; in other words, tailoring without loss of upgradability.
- The ability to perform prototyping to provide near-term approximate answers while accelerating the ultimate solution design.

With embedded BI, organizations do not have to spend weeks or months analyzing a specific problem before embarking on design. Embedded BI permits the setup of a quick prototype for testing, which can then be improved by going back and refining the ETL, the schema definitions, the KPI definitions, etc., until the new solution produces appropriate results. Other benefits include rapid implementation; vendor-provided updates, maintenance and training; and exposure to and learning opportunities via the vendor's community of users and experts.
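As a rough illustration of that prototype-and-refine loop, the sketch below extracts a handful of records, computes one provisional KPI and prints it, so the definition can be challenged and refined on the next pass. The data, field names and KPI definition are hypothetical assumptions for illustration only, not a vendor's actual extract or metric.

```python
from statistics import mean

# Hypothetical extract: a few trouble tickets pulled from an operational system.
tickets = [
    {"opened": 0,  "restored": 95,  "cause": "storm"},
    {"opened": 10, "restored": 55,  "cause": "equipment"},
    {"opened": 30, "restored": 150, "cause": "storm"},
]

def average_restoration_minutes(rows):
    """Provisional KPI definition; refine it (e.g., exclude major storm events)
    on the next prototype iteration if the first results look wrong."""
    return mean(row["restored"] - row["opened"] for row in rows)

print(f"All causes: {average_restoration_minutes(tickets):.0f} min")
# A quick refinement pass might exclude storm events to isolate controllable outages.
non_storm = [t for t in tickets if t["cause"] != "storm"]
print(f"Excluding storms: {average_restoration_minutes(non_storm):.0f} min")
```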
Not all embedded approaches are created equal, however. Among the elements that increase value are:

- A working framework from which to pull information and deliver value immediately, which will fast-track pertinent knowledge based on actual data. Look for solutions that are delivered with predefined data extracts and reporting structures.
- A starter kit of common metrics and measures that are used throughout the industry. Utility companies should look for metrics that focus on such concerns as outage duration, preventive maintenance, bill accuracy, queries resolved during an initial call from the customer, and so on (a minimal example appears at the end of this section).
- Performance protection. While many business intelligence applications can operate comfortably within a production environment, that is not always the case. For very high volumes, it's crucial to implement a solution that extracts data directly from the production operating system and holds it in a more efficient vehicle. This will maximize system performance and operational flow.
- Flexibility. Business intelligence effectiveness depends on users' ability to adapt information presentation to the way they work, not the other way around. Simple information access is not enough. Users must be able to filter and sort it in order to highlight the exact information they need when they need it. Predetermined analytics should provide not an end point but, instead, a launching pad from which users can create the performance indicators they need. Additionally, users will recognize problems and opportunities for improvement far more quickly when they can adapt graphical formats for fast and easy examination and analysis of evolving situations.
- The ability to add and subtract business facts and other data quickly and easily. Users need to be able to see the consequences of, for instance, new regulations under consideration, a sudden population increase or decrease, or a rise or fall in the price of fuel or other supplies. In some cases, users may need to add an entire new source of data to permit effective response to new business imperatives; the best BI applications make that easy.
- Provider expertise. The most effective solutions focus on the needs of a specific industry. The solution provider must have a broad knowledge of the industry's business applications, regulatory environment and unique business challenges.
- Appropriate sizing. An application that works with a single data source can address specific customer or organization needs. That will probably be inadequate, however, for executives who must address issues across multiple enterprise-wide business processes.

There is no denying that embedded business intelligence requires balancing and trade-offs. Not all of these solutions can draw information from multiple, complex databases, and when they can't, missing information can degrade the decision-making process. A second contentious area is the extent to which an application accommodates unique organizational needs; accommodations that decrease user time and effort may also, in the long run, restrict application upgradability.
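As a small example of the kind of starter-kit metric mentioned above, the sketch below computes a first-call resolution rate from call-center records. The record layout and metric definition are hypothetical assumptions, not an industry-standard or vendor-defined KPI.

```python
# Hypothetical call-center extract: one row per customer query.
calls = [
    {"query_id": 1, "calls_to_resolve": 1},
    {"query_id": 2, "calls_to_resolve": 3},
    {"query_id": 3, "calls_to_resolve": 1},
    {"query_id": 4, "calls_to_resolve": 2},
]

def first_call_resolution_rate(rows):
    """Share of customer queries resolved during the initial call."""
    if not rows:
        return 0.0
    resolved_first_time = sum(1 for r in rows if r["calls_to_resolve"] == 1)
    return resolved_first_time / len(rows)

print(f"First-call resolution: {first_call_resolution_rate(calls):.0%}")  # 50%
```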
Implementing Embedded BI

Implementing an embedded business intelligence solution is vastly different from implementing customized BI. While custom BI solutions generally rest on extensive and extended analysis, out-of-the-box solutions, as explained above, rest on a prototyping process in which you begin with approximation and move iteratively closer to the ideal by redefining and redesigning. Successful prototyping means avoiding overanalysis. Instead, it's prudent to move relatively rapidly through initial steps that:

- Define KPIs. Begin with those most important to organizational success and add refinements later.
- Determine the reporting structures (star schemas) that support the KPIs (a sketch appears at the end of this section).
- Map data sources to the structures.
- Determine scheduling. Extract frequency depends on how often information changes and the consequences of using outdated information. For tasks like product introductions, managers may be able to make sound decisions based on trends shown across several months of historic data, the most recent of which may be a week or two old. For tasks like informing managers of progress on a current outage resolution against the previous year's trends, extracts every 15 minutes may be too slow.

These steps are all much easier with a vendor that provides considerable application power and guidance during the first implementation. Use the vendor's experience and conventional wisdom to develop reasonable solutions to KPI measurement, then improve upon them with experience. Remember that the goal is getting a solution that starts producing answers almost immediately. While a vendor should also deliver long-term plans for upgrades, the initial solution has to fit today's needs along with the configurability and flexibility to support tomorrow's unanticipated conditions and changing goals.
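To give a concrete picture of what a reporting structure (star schema) looks like in these steps, here is a minimal sketch: one fact table surrounded by two dimension tables, with a KPI query joining them. The table and column names are hypothetical and deliberately simplified.

```python
import sqlite3

# Minimal star schema sketch. Table and column names are hypothetical,
# simplified for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,
        calendar_day TEXT,          -- e.g. '2007-05-14'
        month        TEXT
    );
    CREATE TABLE dim_region (
        region_key  INTEGER PRIMARY KEY,
        region_name TEXT
    );
    CREATE TABLE fact_outage (
        date_key           INTEGER REFERENCES dim_date(date_key),
        region_key         INTEGER REFERENCES dim_region(region_key),
        customers_affected INTEGER,
        duration_minutes   REAL
    );
""")

# A KPI query then joins the fact table to whichever dimensions matter,
# e.g. average outage duration by month and region.
kpi_sql = """
    SELECT d.month, r.region_name, AVG(f.duration_minutes)
    FROM fact_outage f
    JOIN dim_date d   ON f.date_key = d.date_key
    JOIN dim_region r ON f.region_key = r.region_key
    GROUP BY d.month, r.region_name
"""
print(conn.execute(kpi_sql).fetchall())  # empty until extracts load the tables
```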
Does Embedded BI Work?

Before they give embedded business intelligence a try, many organizations will have already experienced the failure or only partial success of large-scale, customized business intelligence projects. Additionally, because embedded solutions are almost always connected intimately to an application or set of applications, organizations are likely to encounter different vendors' versions of embedded BI as they move across the IT structure. Therefore, it may be possible to move beyond traditional measures of application success, such as how many people are using the product after a year or the level of user complaints. Here are some metrics with which to compare various approaches to BI:

- How long did it take to achieve usable results? That is, were the KPIs defined in step 1 actually used, or did the process take so long that the requirements changed before the solution was delivered?
- What were the costs in time and money?
- Were investments made in the application at the start usable throughout the project? In other words, could we add new data or change parameters along the way without throwing out everything accomplished to that point?
- Was rollout quick and reasonably painless? Web-based solutions tend to score high in this category, since their familiarity minimizes user training. They also ease the IT burden, since there's no software to roll out.

The Case for Embedded BI

Today's challenges and tomorrow's trials and opportunities necessitate a state of readiness and business agility that can only be supported by rapid, ready access to business-critical information. Managers and executives need as much help as technology can offer to condense volumes of complex, disparate data from multiple, mission-critical data sources into a cohesive knowledge base. From this base, they can identify risks, determine trends, forecast more accurately, and identify cause-and-effect relationships that might not otherwise have been apparent. With this information, organizations can optimize operations, reduce the cost of servicing customers and identify opportunities to sell new products and services.

This level of intelligence promotes a culture of continuous improvement, which enhances the ability to predict changing market conditions, and it offers executives the ability to highlight and respond to concrete opportunities for business optimization and an enhanced bottom line.