Tomorrow’s Bill Payment Solutions for Today’s Businesses

Providing consumers with innovative services for more than 150 years, Western Union is an established leader in electronic and cash bill-payment solutions. We introduced our first consumer-to-consumer money transfer service in 1871 and began offering consumer-to-business bill payment services in 1989 with the introduction of the Western Union Quick Collect® service, providing consumers in the United States with convenient walk-in agent network locations where they can pay bills in cash.

By 2008, our comprehensive suite of services had grown to include Speedpay® – an electronic bill payment option that provides businesses with Internet, IVR, desktop, mobile payment, online banking and call center solutions, as well as e-bill presentment with payments and interactive outbound messaging integrated with payment processing.

THE CONSUMER-TO-BUSINESS SEGMENT

Western Union’s electronic and cash bill payment services provide consumers with fast, convenient ways to send one-time or recurring payments to a broad spectrum of industries. At Western Union we have relationships with more than 6,000 businesses and organizations that receive consumer payments, including utilities, auto finance companies, mortgage servicers, financial service providers and government agencies. These relationships form a core component of our consumer-to-business payment service and are one reason we were able to process 404 million consumer-to-business transactions in 2007.

PORTFOLIO OF SERVICES

Our consumer-to-business services give consumers choices in payment type and method, and include the following options:

  • Electronic payments. Consumers and billers use our Speedpay® service in the United States and the United Kingdom to make consumer payments to a variety of billers using credit cards, ATM cards and debit cards, and via ACH withdrawal. Payments are initiated through multiple channels, including biller-hosted websites, westernunion.com, IVR units, Online Banking websites and call centers.
  • Cash payments. Consumers use our Quick Collect® or Prepaid® services to send guaranteed funds to businesses and government agencies using cash (and in select locations, debit cards). Quick Collect is available at nearly 60,000 Western Union agent locations across the United States and Canada, while our Prepaid service can be accessed at more than 40,000 U.S. locations. Consumers can also use our Convenience Pay® service to send payments by cash or check from a smaller number of agent locations primarily to utilities and telecommunication providers.

DISTRIBUTION AND MARKETING CHANNELS

Our electronic payment services are available primarily through IVR, over the Internet and via call center, where a desktop application is used while the consumer speaks with a biller’s customer service representative. Through our Quick Pay® service, billers can receive payments sent from outside the United States and Canada from over 320,000 agent locations in more than 200 countries and territories around the world. We work in partnership with our billers to market our services to consumers in a number of ways, including direct mail, Email, Internet and point-of-sale advertising.

ONLINE BANKING

In late 2007, Western Union launched its Online Banking initiative, helping to change the way consumers pay their bills. The channel shortens the time it takes billers to receive payment from two to four days to next-day or same-day delivery, and enables Western Union Payment Services to process bill payments initiated by consumers from their banks’ online banking sites.

Western Union plans to work with the nation’s largest banks to provide your customers with a new class of online banking payment – one that allows them to make same- and next-day payments that are posted and funded to you faster and are of higher quality than other online banking payments currently available.

EMAIL BILL PRESENTMENT AND PAYMENT

While the benefits of electronic bill presentment and payment are compelling for both billers and consumers, low consumer adoption rates have prevented billers from fully realizing the cost savings and improved customer service levels these services promote. Western Union® Payment Services aims to change this through its integration with Striata® Email bill presentment and payment (EBPP) solutions.

With this integrated, encrypted Email bill presentment and one-click payment service, consumers no longer need to register to receive their bill electronically, visit a separate website to download the bill and send a payment, or remember multiple user names and passwords. By removing these extra steps from the process, these services become dramatically easier to use for consumers.

The critical differentiator of the Western Union/Striata service is that the entire e-bill is delivered directly into the consumer’s in-box as an encrypted off-line attachment, enabling payment to be sent through the e-bill itself using the Western Union® Speedpay service. While complementary to existing online presentment solutions, this “push” Email billing offering can be more successful at driving adoption.

Bill Pay and Presentment Solutions for Utility Companies

Recognizing that not all customers view and pay bills in the same way, CheckFree helps you deliver a complete range of billing and payment options – from the traditional methods of receiving and paying bills by mail, in person and over the phone to completely paperless online billing and payment through either a bank’s website or your own. CheckFree offers solutions that help you meet market demands.

Whether you need to improve a single solution or your entire offering, CheckFree can offer experience and expertise in the following payment channels:

  • By Mail. Some people still choose to receive paper bills and write checks. CheckFree can help turn these paper checks into ACH electronic debits, speeding payment collections.
  • In Person. Give your customers in-person payment convenience and choice to use cash, checks, money orders or merchant-issued certificates.
  • By Phone. Enable your customers to pay a bill anywhere they have access to a phone, all day, every day. With the recent acquisition of CheckFree by Fiserv, you can look for Fiserv’s industry-leading BillMatrix platform to be integrated into our suite of offerings.
  • Online. Deliver bill paying ease and convenience through CheckFree’s full range of electronic billing and payment (EBP) solutions at your site and beyond your site.
  • Emergency Payments. Offer a fee-based option for last-minute online payments and eliminate expenses due to delinquent payments.
  • Electronic Remittance. Provide quicker access to payment funds while reducing the cost of processing paper checks.

CUSTOMER INTERACTION OPTIMIZATION

CheckFree solutions enable you to optimize each customer interaction by offering multiple payment channel options that focus on security, reliability, functionality and convenience. Each interaction with the consumer represents an ideal opportunity to enhance the customer experience and build loyal customers.

Our Customer Interaction Optimization solutions make interactions a win/win for both you and your customers. You deliver the payment channels they seek while maintaining the ability to guide them to the most profitable channel for your organization. The ultimate business objective is to steer customers to the lower cost-to-serve billing and payment option: the online channel.

CheckFree understands your company’s strategic need to direct consumers to the optimal online channel to enhance revenue growth through reductions in operating costs. Having invested substantially in consumer behavior, segmentation and marketing research, CheckFree can help you create marketing campaigns focused on promoting your online channel. Every bill received, payment made or visit to your website can be used to strategically drive adoption of online bill pay, e-bills and paper shut-off.

For more than 25 years, CheckFree has been a leading provider of electronic billing and payment services. We process more than one billion electronic payments each year. With CheckFree’s Customer Interaction Optimization solutions, you can enhance your payment offerings while improving your bottom line.

About Alcatel-Lucent

Alcatel-Lucent’s vision is to enrich people’s lives by transforming the way the world communicates. Alcatel-Lucent provides solutions that enable service providers, enterprises and governments worldwide to deliver voice, data and video communication services to end users. As a leader in carrier and enterprise IP technologies; fixed, mobile and converged broadband access; applications and services, Alcatel-Lucent offers the end-to-end solutions that enable compelling communications services for people at work, at home and on the move.

With 77,000 employees and operations in more than 130 countries, Alcatel-Lucent is a local partner with global reach. The company has the most experienced global services team in the industry and includes Bell Labs, one of the largest research, technology and innovation organizations focused on communications. Alcatel-Lucent achieved adjusted revenues of €17.8 billion in 2007, and is incorporated in France, with executive offices located in Paris.

YOUR ENERGY AND UTILITY PARTNER

Alcatel-Lucent offers comprehensive capabilities that combine carrier-grade communications technology and expertise with utility industry-specific knowledge. This IP transformation expertise and utility market knowledge have led to the development of turnkey communications solutions designed for the energy and utility market. Alcatel-Lucent has extensive experience in:

  • Transforming and renewing network technologies;
  • Designing and implementing SmartGrid initiatives;
  • Meeting NERC CIP compliance and security requirements;
  • Working in live power generation, transmission and distribution environments;
  • Implementing and managing complex mission-critical communications projects;
  • Developing best-in-class partnerships with organizations like CURRENT Communications, Ambient, BelAir Networks, Alvarion and others in the utility industry.

Working with Alcatel-Lucent enables energy and utility companies to realize the increased reliability and greater efficiency of next-generation communications technology, providing a platform for – and minimizing the risks associated with – moving to SmartGrid solutions. And Alcatel-Lucent helps energy and utility companies achieve compliance with regulatory requirements and reduce operational expenses while maintaining the security, integrity and high availability of their power infrastructure and services.

ALCATEL-LUCENT IP MPLS SOLUTION FOR THE NEXT-GENERATION UTILITY NETWORK

Utility companies are experienced at building and operating reliable and effective networks to ensure the delivery of essential information and maintain flawless service delivery. The Alcatel-Lucent IP/MPLS solution can enable utility operators to extend and enhance their networks with new technologies like IP, Ethernet and MPLS. These new technologies will enable the utility to optimize its network to reduce both capital expenditures and operating expenses without jeopardizing reliability. Advanced technologies also allow the introduction of new applications that can improve operational and workflow efficiency within the utility. Alcatel-Lucent leverages cutting-edge technologies along with the company’s broad and deep experience in the utility industry to help utility operators build better, next-generation networks with IP/MPLS.

THE ALCATEL-LUCENT ADVANTAGE

Alcatel-Lucent has years of experience in the development of IP, MPLS and Ethernet technologies. The Alcatel-Lucent IP/MPLS solution offers utility operators the flexibility, scale and feature sets required for mission-critical operation. With the broadest portfolio of products and services in the telecommunications industry, Alcatel-Lucent has the unparalleled ability to design and deliver end-to-end solutions that drive next-generation communications networks.

Delivering the Tools for Creating the Next-Generation Electrical SmartGrid

PowerSense delivers cutting-edge monitoring and control equipment together with integrated supervision to enable the modern electrical utility to prepare its existing power infrastructure for tomorrow’s SmartGrid.

PowerSense uses world-leading technology to merge existing and new power infrastructures into the existing SCADA and IT systems of the electrical utilities. This integration of the upgraded power infrastructure and existing IT systems instantly optimizes outage and fault management, thereby decreasing customer minutes lost (the System Average Interruption Duration Index, or SAIDI).
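As a rough sketch of the metric, SAIDI is the total customer-minutes interrupted divided by the total number of customers served. The Python below illustrates the arithmetic with entirely made-up outage records (not PowerSense data):

```python
# Hypothetical outage records: (customers_affected, duration_minutes)
outages = [(1200, 45), (300, 90), (50, 15)]
total_customers = 100_000  # customers served by the utility

# SAIDI = sum of customer-minutes interrupted / customers served
customer_minutes = sum(n * d for n, d in outages)
saidi = customer_minutes / total_customers
print(f"SAIDI: {saidi:.4f} minutes per customer per period")
```

Faster fault location and restoration shrinks the duration term for each outage, which is exactly how the integration described above drives SAIDI down.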

At the same time, this integration helps the electrical utility further improve asset management (resulting in major cost savings) and power management (resulting in high-performance outage management and high power efficiency). The PowerSense product line is called DISCOS® (Integrated Supervision and Control System for distribution networks).

DISCOS®

The following outlines the business and system values offered by the DISCOS® product line.

Business Values

  • Cutting-edge optical technology (the sensor)
  • Easily and safely retrofitted (sensors can be fitted into all transformer types)
  • End-to-end solutions (from sensors to laptop)
  • Installation in steps (implementation based on cost-benefit analysis)

System Values

  • Current (for each phase)
  • Voltage (for each phase)
  • Frequency
  • Power (active, reactive and direction)
  • Distance-to-fault measurement
  • Control of breakers and service relays
  • Analog inputs
  • Measurement of harmonic content for I and V
  • Measurement of earth fault

These parameters are available for both medium- and low-voltage power lines.
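To illustrate how the per-phase quantities in the list relate, active and reactive power follow from the RMS voltage, RMS current and the angle between them. This is a minimal Python sketch with assumed example values, not DISCOS® internals:

```python
import math

def phase_power(v_rms, i_rms, angle_deg):
    """Return (active W, reactive var) for one phase, given RMS voltage,
    RMS current and the phase angle between voltage and current."""
    phi = math.radians(angle_deg)
    p = v_rms * i_rms * math.cos(phi)  # active power (W)
    q = v_rms * i_rms * math.sin(phi)  # reactive power (var)
    return p, q

# Illustrative single-phase reading: 230 V, 10 A, current lagging by 30 degrees
p, q = phase_power(230.0, 10.0, 30.0)
```

With per-phase current and voltage phasors available, the monitoring system can report active power, reactive power and flow direction (the sign of p) as listed above.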

OPTICAL SENSOR TECHNOLOGY

With its stability and linearity, PowerSense’s cutting-edge sensor technology is setting new standards for current measurements in general. For PowerSense’s primary business area of MV grid monitoring in particular, it is creating a completely new set of standards for how to monitor the MV power grid.

The DISCOS® Current Sensor is part of the DISCOS® Opti module. The DISCOS® Sensor monitors the current magnitude and angle on both the LV and MV sides of the transformer.

BASED ON THE FARADAY EFFECT

Today, only a few applications in measuring instruments are based on the Faraday rotation principle. For instance, the Faraday effect has been used for measuring optical rotary power, for amplitude modulation of light and for remote sensing of magnetic fields.

Now, thanks to advanced computing techniques, PowerSense is able to offer a low-priced optical sensor based on the Faraday effect.
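The underlying physics is compact: the Faraday effect rotates the polarization of light by an angle proportional to the magnetic flux density along the optical path. The sketch below uses the standard relation with purely illustrative numbers (the Verdet constant and field values are assumptions, not PowerSense specifications):

```python
def faraday_rotation(verdet, b_field, length):
    """Polarization rotation angle (rad) from the Faraday effect:
    theta = V * B * L, where V is the Verdet constant of the medium
    (rad / (T*m)), B the magnetic flux density (T) and L the path
    length (m) through the field."""
    return verdet * b_field * length

# Illustrative values only: V = 3.2 rad/(T*m), B = 0.05 T, L = 0.1 m
theta = faraday_rotation(3.2, 0.05, 0.1)
```

Measuring this rotation angle lets the sensor infer the magnetic field produced by the conductor, and hence the current, without galvanic contact.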

THE COMPANY

PowerSense A/S was established on September 1, 2006, by DONG Energy A/S (formerly Nesa A/S) as a spin-off of the DISCOS® product line business. The purpose of the spin-off was to ensure the best future business conditions for the DISCOS® product line.

After the spin-off, BankInvest A/S, a Danish investment bank, holds 70 percent of the share capital. DONG Energy A/S continues to hold 30 percent of the share capital.

Customer Service in the Brave New World of Today’s Utilities

A NEW GENERATION OF CUSTOMER

Today’s utility customers are energy-dependent, information-driven, technologically advanced, willing to change and environmentally conscious. Their grandparents prompted utilities to develop and offer levelized billing, and their parents created the need for online bill presentment and credit card payment. This new generation of customer is about to usher in a brave new world of utility customer service in which the real-time utility will conduct business 24 hours a day, seven days a week, 365 days a year, and Internet-savvy consumers will have all the capabilities of the current customer service representative. They’ll be able to receive pricing signals and control their utility usage via Internet portals, as well as shop among utilities for the best price and switch providers.

Expectations of system reliability are high today. Ten years ago, when the customer called to let you know their power was out, the call took 20 seconds; today, they expect you to already know that their power is out and be able to provide additional information about the nature and duration of that outage. What’s wrong? Are crews on the way? What’s the ETR? Can you text me when it’s back on? The call that includes these questions (and more) takes three times as long as that phone call 10 years ago. Thankfully, utility technology is coming of age just in time to meet the needs of evolving utility customers.

Many utilities already use automated circuit switchers to monitor lines for potential fault conditions and to react in real time to isolate faults and restore power. Automated metering systems send out “last gasp” outage notifications to outage management systems to predict the location of a problem for quicker restoration of service. Two-way communications systems send signals to smart appliances, system monitoring devices and customer messaging orbs to affect customer usage patterns. Fiber-to-the-home (FTTH) and wireless systems communicate meter usage in near real time to enable monitoring for abnormal consumption patterns. If customers have all of this data at their fingertips, what more will they expect from their utility service professionals? Advanced metering infrastructure (AMI) and two-way communications between customer and utility provider are essential to the future of these innovations. Figure 1 indicates the penetration of advanced metering by region.

A TOUCH OF ORWELL

This brave new world is not without risk. Tremendous amounts of data will be acquired and maintained. Monthly usage habits of consumers can provide incredible insight into customers’ lives – imagine the knowledge that real-time data can provide. As marketers begin to understand the powerful communications channels utilities possess, partnerships will emerge to maximize their value. Privacy laws and regulations defining proper use and misuse of data similar to Customer Private Network Information (CPNI) legislation will emerge just as they did in the telecommunications industry. Thus, it would be wise for the utility industry to take steps to limit use prior to legislative mandates being enacted that would create barriers to practical use.

EMERGING BUSINESSES CREATING VALUE FOR CUSTOMERS

Many of the technologies discussed in this paper already exist; the future will simply make their application more common – the interesting part will come in seeing how these products and services are bundled and who will provide them. Over the next 10 years, many new services (and a few new spins on old ones) will be offered to the consumer via this new infrastructure. The array of service offerings will be as broad as the capabilities that are created through the utilities infrastructure design. Utilities offering only one-way communication from the meter will be limited, while utilities with two-way communication riding their own fiber-optic systems will find a vast number of opportunities. Some of these services will fall within the core competency of the utility and be a natural fit in creating new revenue streams; others will require new partnerships to enable their existence. Some will span residential, commercial and industrial market segments, while others will be tailored to the residential customer only.

Energy management and consulting services will flourish during the initial period, especially in areas where time-of-use rates are incorporated in all market segments. Cable, Internet, telephone and security services will consolidate in areas where fiber-to-the-home is part of the infrastructure. Utilities’ ability to provide these services may be greatly affected by their legal and regulatory structures. Where limitations are imposed related to scope and type of services, partnerships will be formed to enable cost-effective service. Figure 2 shows what utilities reported to be the most common AMI system usages in a recent Federal Energy Regulatory Commission (FERC) survey.

As shown in Figure 2, load control, demand response monitoring and notification of price changes are already a part of the system capabilities. As an awareness of energy efficiency develops, a new focus on conservation will give rise to a newfound interest in smart appliances. Their operational characteristics will be more sophisticated than those of the “cycle and save” era, and they will meet customers’ demand for energy savings and environmental friendliness. This will not be limited to water heaters and heating, ventilation and air conditioning (HVAC) units. The new initiatives will encompass refrigerators, freezers, washers, dryers and other second-order appliances, driving conservation derived from time-of-day use to a new level. And these initiatives will not be limited to electricity.

IMPACTS OF TECHNOLOGICAL CHANGE ON OTHER UTILITIES

Very few utility services will be exempt from the impact of changes in the electric industry. Natural gas and water usage, too, will be impacted as the nation focuses its attention on the efficient use of resources. Natural gas time-of-use rates will emerge, along with interruptible rates for residential consumers. This may take 10 to 15 years to occur, and a declining usage trend will need to be reversed; however, the same infrastructure restraints and concerns that plague the electric industry will be recognized in the natural gas industry as well. Thus, we can expect energy providers to adopt these rates in the future to stay competitive. If the electric systems are able to shift peak usage and levelize loads, the need for natural gas-fired generation will diminish. Natural gas-fired generation plants for system peaking would become unnecessary, and the decrease in demand would assist in stabilizing natural gas pricing.

Water availability issues are no longer limited to the Western United States, with areas such as Atlanta now beginning to experience water shortages as well. As a result, reverse-step rates that encourage water usage are being replaced with fixed and progressive step-rate structures to encourage water conservation. Automated metering can assist in eliminating waste, identifying excessive use during curtailment periods and creating a more efficient water distribution system. As energy time-of-use rates are implemented, water and wastewater treatment plants may find efficiencies in offering time-of-use rates as well in order to shape the usage characteristics of their customers without adding increased facilities. Even if this does not occur, time-of-use shifting of electrical load will have an impact on water usage patterns and effectively change water and wastewater operational characteristics.
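The move from reverse-step to progressive step-rate structures mentioned above is straightforward to express: each successive block of consumption is billed at a higher unit price. The tiers and prices in this Python sketch are invented for illustration, not any actual tariff:

```python
def tiered_bill(usage, tiers):
    """Bill under a progressive step-rate structure.
    tiers: list of (block_size, price_per_unit); block_size of None
    means 'all remaining usage' for the final tier."""
    bill, remaining = 0.0, usage
    for block, price in tiers:
        used = remaining if block is None else min(remaining, block)
        bill += used * price
        remaining -= used
        if remaining <= 0:
            break
    return bill

# Hypothetical tiers: first 10 units cheap, next 20 moderate, rest costly
tiers = [(10, 1.00), (20, 1.50), (None, 2.50)]
bill = tiered_bill(40, tiers)
```

Because the marginal price rises with consumption, heavy users face the conservation signal directly, which is the point of replacing declining-block rates.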

In a world of increasing environmental vulnerability, the ability to monitor backflow in water metering will be essential in our efforts to be environmentally safe and monitor domestic threats to the water supply. Although technology’s ability to identify such threats will not prevent their occurrence, it will help utilities evaluate events and respond in order to isolate and diminish possible future threats.

IMPLICATIONS FOR UTILITIES

The above-described technological innovations don’t come without an impact to the service side of utilities. It will be difficult at best for utilities to modify legacy systems to take advantage of the benefits found in new technologies. More robust computer systems implemented in preparation for Y2K will be capable of some modifications; however, new software offerings are being designed today to address the vast opportunities that will soon exist. Processes for data management, storage, retrieval and use will need to be developed. And a new breed of customer service representative will begin to evolve. New technologies, near real-time information available to the consumer, unique customer and appliance configurations, and partnerships and services that go beyond the core competencies of the current workforce will create a short-term gap in trained customer service professionals. Billing departments will expand as rates become more complex. And the increased flexibility of customer information systems will require extensive checks and verifications to ensure accuracy.

Figure 3 (created by Robert Pratt of Pacific Northwest National Laboratory) provides a picture of the new landscape being created by the technologies utilities are implementing and the implications they have for customers.

Utilities with completely integrated systems will be the biggest winners in the future. Network management; geographic information systems; customer information systems; work order systems; supervisory control and data acquisition (SCADA) systems; and financial systems that communicate openly will be positioned to recognize the early wins that will spark the next decade of innovation. Cost-to-serve models continue to resonate as a popular topic among utility providers, and the impact of new technology will assist in making this integral to financial success.

The processes underlying current policies and procedures were designed for the way utilities traditionally operated – which is precisely why today’s utilities must take a systematic approach to re-evaluating their business processes if they’re to take advantage of new technology. They’ll even need to consider the cost of providing a detailed bill and mail delivery. The existence of real-time readings may bring dramatic changes in payment processing. Prepay accounts may eliminate the need to require deposits or assume risk for uncollectible accounts. Daily, weekly and semi-monthly payments may bring added cost (as may allowing customers to choose their due dates in the traditional arrears billing model); thus, utilities must consider the implications of these actions on cash flow and risk before implementing them. Advance notice of service interruption due to planned maintenance or construction can be communicated electronically over two-way automated meter reading (AMR) systems to orbs, communication panels, computers or other means. These same capabilities will dramatically change credit and collections efforts over the next 10 years. Electronic notification of past due accounts, shut-off and reconnection can all be done remotely at little cost to the utility.

IMPLICATIONS FOR CONSUMERS

Customers and commercial marketing efforts will be the driving forces for much of the innovation we’ll witness in coming years. No longer are customers simply comparing utilities against each other; today, they’re comparing utility customer service with their best and worst customer experiences regardless of industry. This means that customers are comparing a utility’s website capabilities with Amazon.com and its service response with the Ritz-Carlton, Holiday Inn or Marriott they might frequent. Service reliability is measured against FedEx. Customer service expectations are raised with every initiative of competitive enterprise – a fact utilities will have to come to terms with if they’re to succeed.

All customers are not created equal. Technologically advanced customers will find the future exciting, while customers who view their utility as just another service provider will find it complicated and at times overwhelming. Utilities must communicate with customers at all levels to adequately prepare them for a future that’s already arrived.

Achieving Decentralized Coordination In the Electric Power Industry

For the past century, the dominant business and regulatory paradigms in the electric power industry have been centralized economic and physical control. The ideas presented here and in my forthcoming book, Deregulation, Innovation, and Market Liberalization: Electricity Restructuring in a Constantly Evolving Environment (Routledge, 2008), comprise a different paradigm – decentralized economic and physical coordination – which will be achieved through contracts, transactions, price signals and integrated intertemporal wholesale and retail markets. Digital communication technologies – which are becoming ever more pervasive and affordable – are what make this decentralized coordination possible. In contrast to the “distributed control” concept often invoked by power systems engineers (in which distributed technology is used to enhance centralized control of a system), “decentralized coordination” represents a paradigm in which distributed agents themselves control part of the system, and in aggregate, their actions produce order: emergent order. [1]

Dynamic retail pricing, retail product differentiation and complementary end-use technologies provide the foundation for achieving decentralized coordination in the electric power industry. They bring timely information to consumers and enable them to participate in retail market processes; they also enable retailers to discover and satisfy the heterogeneous preferences of consumers, all of whom have private knowledge that’s unavailable to firms and regulators in the absence of such market processes. Institutions that facilitate this discovery through dynamic pricing and technology are crucial for achieving decentralized coordination. Thus, retail restructuring that allows dynamic pricing and product differentiation, doesn’t stifle the adoption of digital technology and reduces retail entry barriers is necessary if this value-creating decentralized coordination is to happen.

This paper presents a case study – the “GridWise Olympic Peninsula Testbed Demonstration Project” – that illustrates how digital end-use technology and dynamic pricing combine to provide value to residential customers while increasing network reliability and reducing required infrastructure investments through decentralized coordination. The availability (and increasing cost-effectiveness) of digital technologies enabling consumers to monitor and control their energy use and to see transparent price signals has made existing retail rate regulation obsolete. Instead, the policy recommendation that this analysis implies is that regulators should reduce entry barriers in retail markets and allow for dynamic pricing and product differentiation, which are the keys to achieving decentralized coordination.

THE KEYS: DYNAMIC PRICING, DIGITAL TECHNOLOGY

Dynamic pricing provides price signals that reflect variations in the actual costs and benefits of providing electricity at different times of the day. Some of the more sophisticated forms of dynamic pricing harness the dramatic improvements in information technology of the past 20 years to communicate these price signals to consumers. These same technological developments also give consumers a tool for managing their energy use, in either manual or automated form. Currently, with almost all U.S. consumers (even industrial and commercial ones) paying average prices, there’s little incentive for consumers to manage their consumption and shift it away from peak hours. This inelastic demand leads to more capital investment in power plants and transmission and distribution facilities than would occur if consumers could make choices based on their preferences and in the face of dynamic pricing.

Retail price regulation stifles the economic processes that lead to both static and dynamic efficiency. Keeping retail prices fixed truncates the information flow between wholesale and retail markets, and leads to inefficiency, price spikes and price volatility. Fixed retail rates for electric power service mean that the prices individual consumers pay bear little or no relation to the marginal cost of providing power in any given hour. Moreover, because retail prices don’t fluctuate, consumers are given no incentive to change their consumption as the marginal cost of producing electricity changes. This severing of incentives leads to inefficient energy consumption in the short run and also causes inappropriate investment in generation, transmission and distribution capacity in the long run. It has also stifled the implementation of technologies that enable customers to make active consumption decisions, even though communication technologies have become ubiquitous, affordable and user-friendly.

Dynamic pricing can include time-of-use (TOU) rates, which are different prices in blocks over a day (based on expected wholesale prices), or real-time pricing (RTP) in which actual market prices are transmitted to consumers, generally in increments of an hour or less. A TOU rate typically applies predetermined prices to specific time periods by day and by season. RTP differs from TOU mainly because RTP exposes consumers to unexpected variations (positive and negative) due to demand conditions, weather and other factors. In a sense, fixed retail rates and RTP are the end points of a continuum of how much price variability the consumer sees, and different types of TOU systems are points on that continuum. Thus, RTP is but one example of dynamic pricing. Both RTP and TOU provide better price signals to customers than current regulated average prices do. They also enable companies to sell, and customers to purchase, electric power service as a differentiated product.
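To make this continuum concrete, here is a minimal sketch comparing one household’s daily bill under fixed, TOU and RTP structures. All prices, hours and load figures are illustrative assumptions, not rates from any actual tariff or from the project described below.

```python
# Compare one day's bill under fixed, TOU and RTP pricing.
# All prices ($/kWh) and hourly loads (kWh) are illustrative only.

FIXED_RATE = 0.10  # flat average price


def tou_rate(hour):
    """Predetermined TOU blocks: a higher on-peak price from 4-8 p.m."""
    return 0.18 if 16 <= hour < 20 else 0.07


def bill(loads, price_fn):
    """Total daily bill for a 24-hour load profile under a price rule."""
    return sum(load * price_fn(hour) for hour, load in enumerate(loads))


# A household drawing 2 kWh per hour, rising to 4 kWh during the peak.
loads = [2.0] * 24
for h in range(16, 20):
    loads[h] = 4.0

# Hypothetical real-time prices that spike with the evening peak.
rtp_prices = [0.05] * 24
for h in range(16, 20):
    rtp_prices[h] = 0.30

fixed_bill = bill(loads, lambda h: FIXED_RATE)   # 56 kWh at $0.10
tou_bill = bill(loads, tou_rate)
rtp_bill = bill(loads, lambda h: rtp_prices[h])
```

Under these assumed numbers, the peaky consumer pays the most under RTP, which is precisely the signal that rewards shifting load off-peak; a household with a flat profile would rank the contracts differently.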

TECHNOLOGY’S ROLE IN RETAIL CHOICE

Digital technologies are becoming increasingly available to reduce the cost of sending prices to people and their devices. The 2007 Galvin Electricity Initiative report “The Path to Perfect Power: New Technologies Advance Consumer Control” catalogs a variety of end-user technologies (from price-responsive appliances to wireless home automation systems) that can communicate electricity price signals to consumers, retain data on their consumption and be programmed to respond automatically to trigger prices that the consumer chooses based on his or her preferences. [2] Moreover, the two-way communication advanced metering infrastructure (AMI) that enables a retailer and consumer to have that data transparency is also proliferating (albeit slowly) and declining in price.

Dynamic pricing and the digital technology that enables communication of price information are symbiotic. Dynamic pricing in the absence of enabling technology is meaningless. Likewise, technology without economic signals to respond to is extremely limited in its ability to coordinate buyers and sellers in a way that optimizes network quality and resource use. [3] The combination of dynamic pricing and enabling technology changes the value proposition for the consumer from “I flip the switch, and the light comes on” to a more diverse and consumer-focused set of value-added services.

These diverse value-added services empower consumers, enabling them to control their electricity choices with more granularity and precision than an environment in which they think only about the total amount of electricity they consume. Digital metering and end-user devices also decrease transaction costs between buyers and sellers, lowering barriers to exchange and to the formation of new markets and products.

Whether they take the form of building control systems that enable the consumer to see the amount of power used by each function performed in a building or appliances that can be programmed to behave differently based on changes in the retail price of electricity, these products and services provide customers with an opportunity to make better choices with more precision than ever before. In aggregate, these choices lead to better capacity utilization and better fuel resource utilization, and provide incentives for innovation to meet customers’ needs and capture their imaginations. In this sense, technological innovation and dynamic retail electricity pricing are at the heart of decentralized coordination in the electric power network.

EVIDENCE

Led by the Pacific Northwest National Laboratory (PNNL), the Olympic Peninsula GridWise Testbed Project served as a demonstration project to test a residential network with highly distributed intelligence and market-based dynamic pricing. [4] Washington’s Olympic Peninsula is an area of great scenic beauty, with population centers concentrated on the northern edge. The peninsula’s electricity distribution network is connected to the rest of the network through a single distribution substation. While the peninsula is experiencing economic growth and associated growth in electricity demand, the natural beauty of the area and other environmental concerns served as an impetus for area residents to explore options beyond simply building generation capacity on the peninsula or adding transmission capacity.

Thus, this project tested how the combination of enabling technologies and market-based dynamic pricing affected utilization of existing capacity, deferral of capital investment and the ability of distributed demand-side and supply-side resources to create system reliability. Two questions were of primary interest:

1) What dynamic pricing contracts do consumers find attractive, and how does enabling technology affect that choice?

2) To what extent will consumers choose to automate energy use decisions?

The project – which ran from April 2006 through March 2007 – included 130 broadband-enabled households with electric heating. Each household received a programmable communicating thermostat (PCT) with a visual user interface that allowed the consumer to program the thermostat for the home – specifically to respond to price signals, if desired. Households also received water heaters equipped with a GridFriendly appliance (GFA) controller chip developed at PNNL that enables the water heater to receive price signals and be programmed to respond automatically to those price signals. Consumers could control the sensitivity of the water heater through the PCT settings.
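The control logic can be pictured with a small sketch. This is a hypothetical interface chosen for illustration, not the actual PNNL controller firmware; the trigger-price idea is the point.

```python
class PriceResponsiveWaterHeater:
    """Sketch of a GridFriendly-style controller: the consumer chooses a
    trigger price through the thermostat settings, and the heater defers
    its heating element whenever the broadcast price exceeds it."""

    def __init__(self, trigger_price):
        self.trigger_price = trigger_price  # $/kWh, consumer-chosen
        self.heating = True

    def on_price_signal(self, price):
        """Called for each price broadcast; returns whether the heating
        element is allowed to run at this price."""
        self.heating = price <= self.trigger_price
        return self.heating


heater = PriceResponsiveWaterHeater(trigger_price=0.15)  # assumed value
```

A more price-sensitive household would simply program a lower trigger; a comfort-first household, a higher one.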

These households also participated in a market field experiment involving dynamic pricing. While they continued to purchase energy from their local utility at a fixed, discounted price, they also received a cash account with a predetermined balance, which was replenished quarterly. The energy use decisions they made would determine their overall bill, which was deducted from their cash account, and they were able to keep any difference as profit. The worst a household could do was a zero balance, so they were no worse off than if they had not participated in the experiment. At any time customers could log in to a secure website to see their current balances and determine the effectiveness of their energy use strategies.

On signing up for the project, the households received extensive information and education about the technologies available to them and the kinds of energy use strategies facilitated by these technologies. They were then asked to choose a retail pricing contract from three options: a fixed price contract (with an embedded price risk premium), a TOU contract with a variable critical peak price (CPP) component that could be called in periods of tight capacity or an RTP contract that would reflect a wholesale market-clearing price in five-minute intervals. The RTP was determined using a uniform price double auction in which buyers (households and commercial) submit bids and sellers submit offers simultaneously. This project represented the first instance in which a double auction retail market design was tested in electric power.
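The clearing rule of a uniform price double auction can be sketched as follows. This is a simplified single-unit version for illustration, not a reproduction of the project’s actual market engine.

```python
def clear_double_auction(bids, offers):
    """Uniform-price double auction: sort buy bids descending and sell
    offers ascending, match pairs while the bid covers the offer, and
    clear every trade at one price (here the midpoint of the marginal
    matched pair). Each bid/offer is for one unit, for simplicity."""
    bids = sorted(bids, reverse=True)
    offers = sorted(offers)
    quantity = 0
    for bid, offer in zip(bids, offers):
        if bid >= offer:
            quantity += 1
        else:
            break
    if quantity == 0:
        return 0, None  # no trade possible
    price = (bids[quantity - 1] + offers[quantity - 1]) / 2
    return quantity, price
```

With bids of 0.30, 0.25 and 0.10 against offers of 0.05, 0.20 and 0.40, two units trade and both clear at the same uniform price, 0.225.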

The households ranked the contracts and were then divided fairly evenly among the three types, along with a control group that received the enabling technologies and had their energy use monitored but did not participate in the dynamic pricing market experiment. All households received either their first or second choice; interestingly, more than two-thirds of the households ranked RTP as their first choice. This result counters the received wisdom that residential customers want only reliable service at low, stable prices.

According to the 2007 report on the project by D.J. Hammerstrom (and others), on average participants saved 10 percent on their electricity bills. [5] That report also includes the following findings about the project:

Result 1. For the RTP group, peak consumption decreased by 15 to 17 percent relative to what the peak would have been in the absence of the dynamic pricing – even though their overall energy consumption increased by approximately 4 percent. This flattening of the load duration curve indicates shifting some peak demand to nonpeak hours. Such shifting increases the system’s load factor, improving capacity utilization and reducing the need to invest in additional capacity, for a given level of demand. A 15 to 17 percent reduction is substantial and is similar in magnitude to the reductions seen in other dynamic pricing pilots.

After controlling for price response, weather effects and weekend days, the RTP group’s overall energy consumption was 4 percent higher than that of the fixed price group. This result, in combination with the load duration effect noted above, indicates that the overall effect of RTP dynamic pricing is to smooth consumption over time, not decrease it.
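The direction of these two effects can be checked with simple load-factor arithmetic. The figures below are stylized, chosen only to match the reported directions (peak down roughly 16 percent, total energy up roughly 4 percent); they are not project data.

```python
def load_factor(avg_load, peak_load):
    """Load factor = average load / peak load. A higher value means
    installed capacity is used more evenly over time."""
    return avg_load / peak_load


# Stylized before/after figures (kW), not project data.
before = load_factor(avg_load=1.00, peak_load=2.00)        # 0.50
after = load_factor(avg_load=1.04, peak_load=2.00 * 0.84)  # ~0.62
```

Even though total energy rises slightly, the flatter profile raises the load factor, which is exactly the capacity-utilization improvement the result describes.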

Result 2. The TOU group achieved both a large price elasticity of demand (-0.17), based on hourly data, and an overall energy reduction of approximately 20 percent relative to the fixed price group.

After controlling for price response, weather effects and weekend days, the TOU group’s overall energy consumption was 20 percent lower than that of the fixed price group. This result indicates that the TOU (with occasional critical peaks) pricing induced overall conservation – a result consistent with the results of the California SPP project. The estimated price elasticity of demand in the TOU group was -0.17, which is high relative to that observed in other projects. This elasticity suggests that the pricing coupled with the enabling end-use technology amplifies the price responsiveness of even small residential consumers.
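To see what an elasticity of -0.17 means in practice, a quick calculation using the constant-elasticity (log-log) form, purely for illustration:

```python
import math


def demand_ratio(price_ratio, elasticity):
    """Constant-elasticity demand: quantity scales as price ** elasticity."""
    return price_ratio ** elasticity


def implied_elasticity(q_ratio, p_ratio):
    """Recover the elasticity from observed quantity and price ratios."""
    return math.log(q_ratio) / math.log(p_ratio)


# With elasticity -0.17, a doubling of the hourly price implies demand
# of about 89 percent of its former level, roughly an 11 percent cut.
q_ratio = demand_ratio(2.0, -0.17)
```

For small residential loads, even that modest percentage response, multiplied across thousands of households and concentrated in peak hours, is what drives the system-level results above.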

Despite these results, dynamic pricing and enabling technologies are proliferating slowly in the electricity industry. Proliferation requires a combination of formal and informal institutional change to overcome a variety of barriers. And while formal institutional change (primarily in the form of federal legislation) is reducing some of these barriers, it remains an incremental process. The traditional rate structure, fixed by state regulation and slow to change, presents a substantial barrier. Predetermined load profiles inhibit market-based pricing by ignoring individual customer variation and the information that customers can communicate through choices in response to price signals. Furthermore, the persistence of standard offer service at a discounted rate (that is, a rate that does not reflect the financial cost of insurance against price risk) stifles any incentive customers might have to pursue other pricing options.

The most significant – yet also most intangible and difficult-to-overcome – obstacle to dynamic pricing and enabling technologies is inertia. All of the primary stakeholders in the industry – utilities, regulators and customers – harbor status quo bias. Incumbent utilities face incentives to maintain the regulated status quo as much as possible (given the economic, technological and demographic changes surrounding them) – and thus far, they’ve been successful in using the political process to achieve this objective.

Customer inertia also runs deep because consumers have not had to think about their consumption of electricity or the price they pay for it – a bias consumer advocates generally reinforce by arguing that low, stable prices for highly reliable power are an entitlement. Regulators and customers value the stability and predictability that have arisen from this vertically integrated, historically supply-oriented and reliability-focused environment; however, what is unseen and unaccounted for is the opportunity cost of such predictability – the foregone value creation in innovative services, empowerment of customers to manage their own energy use and use of double-sided markets to enhance market efficiency and network reliability. Compare this unseen potential with the value creation in telecommunications, where even young adults can understand and adapt to cell phone-pricing plans and benefit from the stream of innovations in the industry.

CONCLUSION

The potential for a highly distributed, decentralized network of devices automated to respond to price signals creates new policy and research questions. Will individuals automate their devices’ responses to price signals? If so, will they adjust the settings, and how? Does the combination of price effects and innovation increase total surplus, including consumer surplus? In aggregate, do these distributed actions create emergent order in the form of system reliability?

Answering these questions requires thinking about the diffuse and private nature of the knowledge embedded in the network, and the extent to which such a network becomes a complex adaptive system. Technology helps determine whether decentralized coordination and emergent order are possible; the dramatic transformation of digital technology in the past few decades has decreased transaction costs and increased the extent of feasible decentralized coordination in this industry. Institutions – which structure and shape the contexts in which such processes occur – provide a means for creating this coordination. And finally, regulatory institutions affect whether or not this coordination can occur.

For this reason, effective regulation should focus not on allocation but rather on decentralized coordination and how to bring it about. This in turn means a focus on market processes, which are adaptive institutions that evolve along with technological change. Regulatory institutions should also be adaptive, and policymakers should view regulatory policy as work in progress so that the institutions can adapt to unknown and changing conditions and enable decentralized coordination.

ENDNOTES

1. Order can take many forms in a complex system like electricity – for example, keeping the lights on (short-term reliability), achieving economic efficiency, optimizing transmission congestion, longer-term resource adequacy and so on.

2. Roger W. Gale, Jean-Louis Poirier, Lynne Kiesling and David Bodde, “The Path to Perfect Power: New Technologies Advance Consumer Control,” Galvin Electricity Initiative report (2007). www.galvinpower.org/resources/galvin.php?id=88

3. The exception to this claim is the TOU contract, where the rate structure is known in advance. However, even on such a simple dynamic pricing contract, devices that allow customers to see their consumption and expenditure in real time instead of waiting for their bill can change behavior.

4. D.J. Hammerstrom et al., “Pacific Northwest GridWise Testbed Demonstration Projects, Volume I: The Olympic Peninsula Project” (2007). http://gridwise.pnl.gov/docs/op_project_final_report_pnnl17167.pdf

5. Ibid.

How Intelligent Is Your Grid?

Many people in the utility industry see the intelligent grid — an electric transmission and distribution network that uses information technology to predict and adjust to network changes — as a long-term goal that utilities are still far from achieving. Energy Insights research, however, indicates that today’s grid is more intelligent than people think. In fact, utilities can begin building the network of the future today by better leveraging their existing resources and focusing on the intelligent-grid backbone.

DRIVERS FOR THE INTELLIGENT GRID

Before discussing the intelligent grid backbone, it’s important to understand the drivers directing the intelligent grid’s progress. While many groups — such as government, utilities and technology companies — may be pushing the intelligent grid forward, they are also slowing it down. Here’s how:

  • Government. With the 2005 U.S. Energy Policy Act and the more recent 2007 Energy Independence and Security Act, the federal government has acknowledged the intelligent grid’s importance and is supporting investment in the area. Furthermore, public utility commissions (PUCs) have begun supporting intelligent grid investments like smart metering. At the same time, however, PUCs have a duty to maintain reasonable prices. Since utilities have not extensively tested the benefits of some intelligent grid technologies, such as distribution line sensors, many regulators hesitate to support utilities investing in intelligent grid technologies beyond smart metering.
  • Utilities. Energy Insights research indicates that information technology, in general, enables utilities to increase operational efficiency and reduce costs. For this reason, utilities are open to information technology; however, they’re often looking for quick cost recovery and benefits. Many intelligent grid technologies provide longer-term benefits, making them difficult to cost-justify over the short term. Since utilities are risk-averse, this can make intelligent grid investments look riskier than traditional information technology investments.
  • Technology. Although advanced enough to function on the grid today, many intelligent grid technologies could become quickly outdated thanks to the rapidly developing marketplace. What’s more, the life span of many intelligent grid technologies is not as long as those of traditional grid assets. For example, a smart meter’s typical life span is about 10 to 15 years, compared with 20 to 30 years for an electro-mechanical meter.

With strong drivers and competing pressures like these, it’s not a question of whether the intelligent grid will happen but when utilities will implement new technologies. Given the challenges facing the intelligent grid, the transition will likely be more of an evolution than a revolution. As a result, utilities are making their grids more intelligent today by focusing on the basics, or the intelligent grid backbone.

THE INTELLIGENT GRID BACKBONE

What comprises this backbone? Answering this question requires a closer look at how intelligence changes the grid. Typically, a utility has good visibility into the operation of its generation and transmission infrastructure but poor visibility into its distribution network. As a result, the utility must respond to a changing distribution network based on very limited information. Furthermore, if a grid event requires attention — such as in the case of a transformer failure — people must review information, decide to act and then manually dispatch field crews. This type of approach translates to slower, less informed reactions to grid events.

The intelligent grid changes these reactions through a backbone of technologies — sensors, communication networks and advanced analytics — especially developed for distribution networks. To better understand these changes, we can imagine a scenario where a utility has an outage on its distribution network. As shown in Figure 1, additional grid sensors collect more information, making it easier to detect problems. Communications networks then allow sensors to convey the problem to the utility. Advanced analytics can efficiently process this information and determine more precisely where the fault is located, as well as automatically respond to the problem and dispatch field crews. These components not only enable faster, better-informed reactions to grid problems, they can also support real-time pricing, improve demand response and better handle distributed and renewable energy sources.
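The sensor-to-crew chain described above can be sketched as a small event-handling step. Everything here (the reading format, the threshold, the dispatch callback) is a hypothetical illustration, not any vendor’s API.

```python
def handle_grid_event(readings, fault_threshold, dispatch):
    """Sketch of the backbone in action: sensors report per-location
    readings, analytics flag values past a fault threshold and localize
    the fault to the worst offender, and a crew is dispatched."""
    faults = {loc: val for loc, val in readings.items() if val > fault_threshold}
    if not faults:
        return None  # nothing to act on
    location = max(faults, key=faults.get)  # crude fault localization
    dispatch(location)  # automatic response, e.g. queue a field crew
    return location
```

For example, given readings from two feeders where only one exceeds the threshold, the function flags that feeder and invokes the dispatch callback once; with no readings past the threshold, it does nothing.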

A CLOSER LOOK AT BACKBONE COMPONENTS

A deeper dive into each of these intelligent grid backbone technologies reveals how utilities are gaining more intelligence about their grid today.

Network sensors are important not only for real-time operations — such as locating faults and connecting distributed energy sources to the grid — but also for providing a rich historical data source to improve asset maintenance and load research and forecasting. Today, more utilities are using sensors to better monitor their distribution networks; however, they’re focused primarily on smart meters. The reason for this is that smart meters have immediate operational benefits that make them attractive for many utilities today, including reducing meter reader costs, offering accurate billing information, providing theft control and satisfying regulatory requirements. Yet this focus on smart meters has created a monitoring gap between the transmission network and the smart meter.

Sensors to fill this monitoring gap are available from companies such as General Electric, ABB, PowerSense, GridSense and Serveron. These sensors track everything from load balancing and transformer status to circuit breakers and tap changers, energized downed lines, high-impedance faults and stray voltage. Yet utilities hesitate to invest in them because the sensors lack the immediate operational benefits of smart meters.

By monitoring this gap, however, utilities can realize longer-term grid benefits such as a reduced need to build generation capacity. Utilities have found they can begin monitoring this gap by:

  • Prioritizing sensor investments. Customer complaints and regulatory pressure have pushed some utilities to take action for particular parts of their service territory. For example, one utility Energy Insights studied received numerous customer complaints about a particular feeder’s reliability, so the utility invested in line sensors for that area. Another utility began considering sensor investments in troubled areas of its distribution network when regulators demanded that the utility improve its System Average Interruption Frequency Index (SAIFI) and System Average Interruption Duration Index (SAIDI) rankings from the bottom 50 percent to the top 25 percent of benchmarked utilities. By focusing on such areas, utilities can achieve “quick wins” with sensors and build confidence in deploying additional sensors on their distribution grid.
  • Realizing it’s all about compromise. Even in high-priority areas, it may not make financial sense for a utility to deploy the full range of sensors for every possible asset. In some situations, utilities may target a particular area of the service territory with a higher density of sensors. For example, a large U.S. investor-owned utility with a medium voltage-sensing program placed a high density of sensors along a specific section of its service territory. On the other hand, utilities might cover a broader area of service territory with fewer sensors, similar to the approach taken by a large investor-owned utility Energy Insights looked at that monitored only transformers across its service territory.
  • Rolling in sensors with other intelligent grid initiatives. Some utilities find ways to combine their smart metering projects with other distribution network sensors or to leverage existing investments that could support additional sensors. One utility that Energy Insights looked at installed transformer sensors along with a smart meter initiative and leveraged the communications networks it used for smart metering.

While sensors provide an important means of capturing information about the grid, communication networks are critical to moving that information throughout the intelligent grid — whether between sensors or field crews. Typically, to enable intelligent grid communications, utilities must either build new communications networks to bring intelligence to the existing grid or incorporate communication networks into new construction. Yet utilities today are also leveraging existing or recently installed communications networks to facilitate more sophisticated intelligent grid initiatives such as the following:

  • Smart metering and automated meter-reading (AMR) initiatives. With the current drive to install smart meters, many utilities are covering their distribution networks with communications infrastructure. Furthermore, existing AMR deployments may include communications networks that can bring data back to the utility. Some utilities are taking advantage of these networks to begin plugging other sensors into their distribution networks.
  • Mobile workforce. The deployment of mobile technologies for field crews is another hot area for utilities right now. Utilities are deploying cellular networks for field crew voice and data communications. Although utilities have typically been hesitant to work with third-party communications providers, they’ve become more comfortable with outside providers after using them for their mobile technologies. Since most of the cellular networks can provide data coverage as well, some utilities are beginning to use these providers to transmit sensor information across their distribution networks.

Since smart metering and mobile communications networks are already in place, the incremental cost of installing sensors on these networks is relatively low. The key is making sure that different sensors and components can plug into these networks easily (for example, using a standard communications protocol).

The last key piece of the intelligent grid backbone is advanced analytics. Utilities are required to make quick decisions every day if they’re to maintain a safe and reliable grid, and the key to making such decisions is being well informed. Intelligent grid analytics can help utilities quickly process large amounts of data from sensors so that they can make those informed decisions. However, how quickly a decision needs to be made depends on the situation. Intelligent grid analytics assist with two types of decisions: very quick decisions (veQuids) and quick decisions (Quids). veQuids are made in milliseconds by computers and intelligent devices analyzing complex, real-time data – an intelligent grid vision that’s still a future development for most utilities.

Fortunately, many proactive decisions about the grid don’t have to be made in milliseconds. Many utilities today can make Quids — often manual decisions — to predict and adjust to network changes within a time frame of minutes, days or even months.

No matter how quick the decision, however, all predictive efforts are based on access to good-quality data. In putting their Quid capabilities to use today — in particular for predictive maintenance and smart metering — utilities are building not only intelligence about their grids but also a foundation for providing more advanced veQuids analytics in the future through the following:

  • The information foundation. Smart metering and predictive maintenance require utilities to collect not only more data but also more real-time data. Smart metering also helps break down barriers between retail and operational data sources, which in turn creates better visibility across many data sources to provide a better understanding of a complex grid.
  • The automation transition. To make the leap between Quids and veQuids requires more than just better access to more information — it also requires automation. While fully automated decision-making is still a thing of the future, many utilities are taking steps to compile and display data automatically as well as do some basic analysis, using dashboards from providers such as OSIsoft and Obvient Strategies to display high-level information customized for individual users. The user then further analyzes the data, and makes decisions and takes action based on that analysis. Many utilities today use the dashboard model to monitor critical assets based on both real-time and historical data.

ENSURING A MORE INTELLIGENT GRID TODAY AND TOMORROW

As these backbone components show, utilities already have some intelligence on their grids. Now they’re building on that intelligence by leveraging existing infrastructure and resources — whether it’s voice communications providers for data transmission or Quid resources to build a foundation for the veQuids of tomorrow. In particular, utilities need to look at:

  • Scalability. Utilities need to make sure that whatever technologies they put on the grid today can grow to accommodate larger portions of the grid in future.
  • Flexibility. Given rapid technology changes in the marketplace, utilities need to make sure their technology is flexible and adaptable. For example, utilities should consider smart meters that have the ability to change out communications cards to allow for new technologies.
  • Integration. Due to the evolutionary nature of the grid, and with so many intelligent grid components that must work together (intelligent sensors at substations, transformers and power lines; smart meters; and distributed and renewable energy sources), utilities need to make sure these disparate components can work with one another. Utilities need to consider how to introduce more flexibility into their intelligent grids to accommodate the increasingly complex network of devices.

As today’s utilities employ targeted efforts to build intelligence about the grid, they must keep in mind that whatever action they take today – no matter how small – must ultimately help them meet the demands of tomorrow.

Making Change Work: Why Utilities Need Change Management

Organizations are often reluctant to engage change management programs, plans and teams. More often, change management programs are launched too late in the project process, are only moderately funded or are absorbed within the project team as part-time responsibilities – all of which we’ve seen happen time and again in the utility industry.

“Making Change Work,” an IBM study done in collaboration with the Center of Evaluation and Methods at Bonn University, analyzed the factors for successful implementation of change. The scope of this study, released in 2007, is now being expanded because the project management and change management professions, formerly aligned, are now at a turning point of differentiation. The reason is simple: too many projects fail to consider both components as critical to success – and therefore lack insight into the day-to-day impact of a change on members of the organization.


WHAT IS CHANGE MANAGEMENT?

Change management is a structured approach to business transformation that manages the transition from a current state to a desired future state. Far from being static or rigid, change management is an ever-evolving program that varies with the needs of the organization. Effective change management involves people and provides open communication.

Change management is as important as project management. However, whereas project management is a tactical activity, change management represents a strategic initiative. To understand the difference, consider the following:

  • Change management is the process of driving corporate strategy by identifying, addressing and managing barriers to change across the organization or enterprise.
  • Project management is the process of implementing the tools needed to enable or mobilize the corporate strategy.

Change management is an ongoing process that works in close concert with project management. At any given time at least one phase of change management should be occurring. More likely, multiple phases will be taking place across various initiatives.

A change management program can be tailored to manage the needs of the organizational culture and relationships. The program must close the gaps among workforce, project team and sponsor leadership during all phases of all projects. It does this by:

  • Ensuring proper alignment of the organization with new technology and process requirements;
  • Preparing people for new processes and technology through training and communication;
  • Identifying and addressing human resource implications such as job definitions, union negotiations and performance measures;
  • Managing the reaction of both individuals and the entire organization to change; and
  • Providing the right level of support for ongoing implementation success.

The three fundamental activities of a change management program are leading, communicating and engaging. These three activities should span the project life cycle to maintain both awareness of the change and its momentum (Figure 1).

KEY ELEMENTS OF A CHANGE PROGRAM

There are three best practice elements that make the difference between successful projects and less successful projects: [1]

Organizational awareness of the challenges inherent in any change. This involves the following:

  • Getting a real understanding of – and leadership buy-in to – the stakeholders and culture;
  • Recognizing the interdependence of strategy and execution;
  • Ensuring an integrated strategy approach linking business strategy, operations, organization design and change and technology strategy; and
  • Educating leadership on change requirements and commitment.

Consistent use of formal methods for change management. This should include:

  • Covering the complete life cycle – from definition to deployment to post-implementation optimization;
  • Allowing for easy customization and flexibility through a modular design;
  • Incorporating change management and value realization components into each phase to increase the likelihood of success; and
  • Providing a published plan with ongoing accountability and sponsorship as well as continuous improvement.

A specified share of the project budget invested in change management. This should involve:

  • Investing in change at a level linked to project success. Projects that invest more than 10 percent of the project budget in change management achieve an average success rate of 45 percent (Figure 2). [2]
  • Assigning the right resources to support change management early on and maintaining the required support. This also limits the adverse impacts of change on an organization’s productivity (Figure 3). [3]

WHY DO UTILITIES NEED CHANGE MANAGEMENT?

Utilities today face a unique set of challenges. For starters, they’re simultaneously dealing with aging infrastructures and aging workforces. In addition, there are market pressures to improve performance, become more “green” and mitigate rising energy costs. To address these realities, many utilities are pursuing merger and acquisition (M&A) opportunities as well as implementing new technologies.

The cost cutting of the past decade combined with M&As has left utilities with gaps in workforce experience as well as budget challenges. Yet utilities are facing major business disruptions going into the next decade and beyond. To cope with these disruptions, companies are implementing new technologies such as the intelligent grid, advanced metering infrastructure (AMI), meter data management (MDM), enterprise asset management (EAM) and work management systems (WMSs). It’s not uncommon for utilities to be implementing multiple new systems simultaneously that affect the day-to-day activities of people throughout the organization, from frontline workers to senior managers.

A change management program can address a number of challenges specific to the utilities industry.

CULTURAL CLIMATE: ‘BUT WE’RE DIFFERENT’

A utility is a utility is a utility. But a deeper look into individual businesses reveals nuances in their relationships with both internal and external stakeholders that are unique to each company. A change management team must intimately understand these relationships. For example, how is the utility perceived externally by regulators, customers, the community and even analysts? As for internal relationships, how do the various operating divisions relate and work together? Some operating divisions work well together on project teams and respect each other and their differences; others do not.

Cultural differences aside, work is work – and only change management can address these relationships. Knowing the utility’s cultural climate and relationships will help shape each phase of the change management program and allow change management professionals to tailor a project or system implementation to fit the company’s culture.

REGULATORY LANDSCAPE

With M&As and increasing market pressures across the United States, the regulatory landscape confronting utilities is becoming more variable. We’ve seen several types of regulatory-related challenges.

Regulatory pressure. Whether regulators mandate or simply encourage new technology implementations can make a significant difference in how stakeholders in a project behave. In general, there’s more resistance to a new technology when it’s required versus voluntarily implemented. Change management can help work through participant behaviors and mitigate obstacles so that project work can continue as planned.

Multiple regulatory jurisdictions. Many utilities with recently expanded footprints following M&As now have to manage requests from and expectations of multiple regulatory commissions. Often these commissions have different mandates. Change management initiatives are needed to work through the complexity of expectations, manage multiple regulatory relationships and drive utilities toward a unified corporate strategy.

Regulatory evolution. Just as markets evolve, so do regulatory influences and mandates. Often regulators will issue orders that can be interpreted in many ways. They may even do this to get information in the form of reactions from their various constituents. Whatever the reason, the reality is that utilities are managing an ever-changing portfolio of regulations. Change management can better prepare utilities for this constant change.

OPERATIONS MATURITY

When new systems and technologies being implemented encompass multiple operating divisions, it can be difficult for stakeholders to agree on operating standards or processes. Project team members representing the various operating regions can resist compromise for fear of losing control. This often occurs when utilities are attempting to integrate systems across operating regions following an acquisition.

Change management helps ensure that various constituents – for example, the regional operating divisions – are prepared for imminent business transformation. In large organizations, this preparation period can take a year or more. But for organizations to realize the benefits of new systems and technology implementations, they must be ready to receive them. Readiness and preparedness are largely the responsibilities of the change management team.

ORGANIZATIONAL COHESIVENESS

Organizational cohesiveness means that all constituents across the organization are equally committed to the business transformation initiative and share the same understanding of the overarching corporate strategy, even as they perform their individual roles and responsibilities.

Senior executives must align their visions and common commitment to change. After all, they set the tone for change through their respective organizations. If they are not in sync with each other, their organizations become silos, and business processes are less likely to be fluid across organizational boundaries. Frontline managers and associates must, in turn, be engaged and enthusiastic about the transformations to come.

Organizational cohesiveness is especially critical during large systems implementations involving utility field operations. Leaders at multiple locations must be ready to communicate and support change – and this support must be visible to the workforce. Utilities must understand this requirement at the beginning of a project to make change manageable, realistic and personal enough to sustain momentum. All too often, we’ve heard team members comment, “We had a lot of leadership at the project kickoff, but we really haven’t seen leadership at any of our activities or work locations since then. The project team tells us what to do.”

Moreover, leadership – when removed from the project – usually will not admit that they’re in the dark about what’s going on. Yet their lack of involvement will not escape the attention of frontline employees. Once the supervisor is perceived as lacking information – and therefore power – it’s all over. Improving customer service and quality, cutting costs, adopting new technology and merging operations all require changing employees. [4]

For utilities, the concept of organizational cohesiveness is especially important because just as much technology “lives” outside IT as inside. Yet the engineers who use this non-IT-controlled technology – what Gartner calls “operations technology” – are usually disconnected from the IT world in terms of both practical planning and execution. However, these worlds must act as one for a company to be truly agile. [5]

Change management methods and tools ensure that organizational cohesiveness exists through project implementation and beyond.

UNION ENGAGEMENT

Successful change requires a sustained partnership with union representatives throughout the project life cycle. Project leadership and union leadership must work together to implement change. Union representation should be on the project team. Representatives can be involved in process reviews, testing and training, or asked to serve as change champions. In addition, communication is critical throughout all phases of a project. Frontline employees must see real evidence of how a change will benefit them. Change is personal: everyone wants to know how his or her job will be impacted.

There should also be union representation in training activities, since workers tend to be more receptive to peer-to-peer support. Utilities should, for example, engage union change champions to help co-workers during training and to be site “go to” representatives. Utilities should also provide advance training and recognize all who participate in it.

Union representatives should also participate in design and/or testing, since they will be able to pinpoint issues that will impact routine daily tasks. It could be something as simple as changing screen labels per their recommendation to increase user understanding.

More than one union workforce may be involved in a project. Location cultures that exist in large service territories or that have resulted from mergers may try to isolate themselves from the project team and resist change. Utilities should assemble a team from various work groups and then do the following to address the history and differences in the workforce:

  • Request ongoing union participation throughout the life of the project.
  • Include union roles as part of the project charter and define these roles with union leadership.
  • Provide a kickoff overview to union leadership.
  • Include union representation in work process development with balanced representation from various areas. Union employees know the job and can quickly identify the pros and cons of work tasks. A structured facilitation process and an issue resolution process are required.
  • Assign a corporate human resource or labor relations role to review processes that impact the union workforce.
  • Develop communication campaigns that address union concerns, such as conducting face-to-face presentations at employees’ work locations and educating union leaders prior to each change rollout.
  • Involve union representatives in training and user support.

Change management is necessary to sort through the relationships of multiple union workforces so that projects and systems can be implemented.

AN AGING WORKFORCE

A successful change management program will help mitigate the aging workforce challenges utilities will be facing for many years to come. [6]

WHAT TO EXPECT FROM A SUCCESSFUL CHANGE MANAGEMENT PROGRAM

The result of a successful change management program is a flexible organization that’s responsive to customer needs, regulatory mandates and market pressures, and readily embraces new technologies and systems. A change-ready organization anticipates, expects and is increasingly comfortable with change and exhibits the following characteristics:

  • The organization is aligned.
  • The leaders are committed.
  • Business processes are developed and defined across all operational units.
  • Associates at all levels have received communications and have continued access to resources.

Facing major business transformations and unique industry challenges, utilities cannot afford to forgo change management programs. This skill set is just as critical as any other role in your organization. Change is a cost. Change should be part of the project budget.

Change is an ongoing, long-term investment. Good change management designed specifically for your culture and challenges minimizes change’s adverse effect on daily productivity and helps you reach and sustain project goals.

ENDNOTES

  1. “Making Change Work” (an IBM study), Center of Evaluation and Methods, Bonn University, 2007; excerpts from “IBM Integrated Strategy and Change Methodology,” 2007.
  2. “Making Change Work,” Center of Evaluation and Methods, Bonn University, 2007.
  3. Ibid.
  4. T.J. Larkin and Sandar Larkin, “Communicating Change: Winning Employee Support for New Business Goals,” McGraw-Hill, 1994, p. 31.
  5. K. Steenstrup, B. Williams, Z. Sumic and C. Moore, “Gartner’s Energy and Utilities Summit: Agility on Both Sides of the Divide,” Gartner Industry Research ID Number G00145388, Jan. 30, 2007, p. 2.
  6. P. R. Bruffy and J. Juliano, “Addressing the Aging Utility Workforce Challenge: ACT NOW,” Montgomery Research 2006 journal.

The Distributed Utility of the (Near) Future

The next 10 to 15 years will see major changes – what future historians might even call upheavals – in the way electricity is distributed to businesses and households throughout the United States. The exact nature of these changes and their long-term effect on the security and economic well-being of this country are difficult to predict. However, a consensus already exists among those working within the industry – as well as with politicians and regulators, economists, environmentalists and (increasingly) the general public – that these fundamental changes are inevitable.

This need for change is in evidence everywhere across the country. The February 26, 2008, temporary blackout in Florida served as just another warning that the existing paradigm is failing. Although at the time of this writing, the exact cause of that blackout had not yet been identified, the incident serves as a reminder that the nationwide interconnected transmission and distribution grid is no longer stable. To wit: disturbances in Florida on that Tuesday were noted and measured as far away as New York.

A FAILING MODEL

The existing paradigm of nationwide grid interconnection – brought about primarily by the deregulation movement of the late 1990s – calls for electricity to be generated at large plants in various parts of the country and then distributed nationwide. There are two reasons this paradigm is failing. First, the transmission and distribution system wasn’t designed to serve as a nationwide grid; it is aged and only marginally stable. Second, political, regulatory and social forces are making the construction of large generating plants increasingly difficult, expensive and ultimately infeasible.

The previous historic paradigm made each utility primarily responsible for generation, transmission and distribution in its own service territory; this had the benefit of localizing disturbances and fragmenting responsibility and expense. With loose interconnections to other states and regions, a disturbance in one area or a lack of resources in a different one had considerably less effect on other parts of the country, or even other parts of service territories.

For better or worse, we now have a nationwide interconnected grid – albeit one that was neither designed for the purpose nor serves it adequately. Although the existing grid can be improved, the expense would be massive, and probably cost prohibitive. Knowledgeable industry insiders, in fact, calculate that it would cost more than the current market value of all U.S. utilities combined to modernize the nationwide grid and replace its large generating facilities over the next 30 years. Obviously, the paradigm is going to have to change.

While the need for dramatic change is clear, though, what’s less clear is the direction that change should take. And time is running short: North American Electric Reliability Corp. (NERC) projects serious shortages in the nation’s electric supply by 2016. Utilities recognize the need; they just aren’t sure which way to jump first.

With a number of tipping points already reached (and the changes they describe continuing to accelerate), it’s easy to envision the scenario that’s about to unfold. Consider the following:

  • The United States stands to face a serious supply/demand disconnect within 10 years. Unless something dramatic happens, there simply won’t be nearly enough electricity to go around. Already, some parts of the country are feeling the pinch. And regulatory and legislative uncertainty (especially around global warming and environmental issues) makes it difficult for utilities to know what to do. Building new generation of any type other than “green energy” is extremely difficult, and green energy – which currently meets less than 3 percent of U.S. supply needs – cannot close the growing gap between supply and demand being projected by NERC. Specifically, green energy will not be able to replace the 50 percent of U.S. electricity currently supplied by coal within that 10-year time frame.
  • Fuel prices continue to escalate, and the reliability of the fuel supply continues to decline. In addition, increasing restrictions are being placed on fuel selection, especially coal.
  • A generation of utility workers is nearing retirement, and finding adequate replacements among the younger generation is proving increasingly difficult.
  • It’s extremely difficult to site new transmission – needed to deal with supply-and-demand issues. Even new Federal Energy Regulatory Commission (FERC) authority to authorize corridors is being met with virulent opposition.

SMART GRID NO SILVER BULLET

Distributed generation – many smaller supply sources replacing fewer large ones – and “smart grids” (designed to enhance delivery efficiency and effectiveness) have been posited as solutions. However, although such solutions offer potential, they’re far from being in place today. At best, smart grids and smarter consumers are only part of the answer. They will help reduce demand (though probably not enough to make up the generation shortfall), and they’re both still evolving as concepts. While most utility executives recognize the problems, they continue to be uncertain about the solutions and have a considerable distance to go before implementing any of them, according to recent Sierra Energy Group surveys.

According to these surveys, more than 90 percent of utility executives now feel that the intelligent utility enterprise and smart grid (IUE/SG) – that is, the distributed utility – represents an inevitable part of their future (Figure 1). This finding was true of all utility types supplying electricity.

Although utility executives understand the problem and the IUE/SG approach to solving part of it, they’re behind in planning on exactly how to implement the various pieces. That “planning lag” for the vision can be seen in Figure 2.

At least some fault for the planning lag can be attributed to forces outside the utilities. While politicians and regulators have been emphasizing conservation and demand response, they’ve failed to produce guidelines for how this will work. And although a number of states have established mandatory green power percentages, Congress failed to do the same in the energy legislation adopted in December 2007. While the EPACT of 2005 “urged” regulators to “urge” utilities to install smart meters, it didn’t make their installation a requirement, and thus regulators have moved at different speeds in different parts of the country on this urging.

Although we’ve entered a new era, utilities remain burdened with the internal problems caused by the “silo mentality” left over from generations of tight regulatory control. Today, real-time data is often still jealously guarded in engineering and operations silos. However, a key component in the development of intelligent utilities will be pushing both real-time and back-office data onto dashboards so that executives can make real-time decisions.

Getting from where utilities were (and in many respects still are) in the last century to where they need to be by 2018 isn’t a problem that can be solved overnight. And, in fact, utilities have historically evolved slowly. Today’s executives know that technological evolution in the utility industry needs to accelerate rapidly, but they’re uncertain where to start. For example, should you install an advanced metering infrastructure (AMI) as rapidly as possible? Do you emphasize automating the grid and adding artificial intelligence? Do you continue to build out mobile systems to push data (and more detailed, simpler instructions) to field crews who soon will be much younger and less experienced? Do you rush into home automation? Do you build windmills and solar farms? Utilities have neither the financial nor human resources to do everything at once.

THE DEMAND FOR AMI

As its name implies, a smart grid will become increasingly self-operating and self-healing – and indeed much of the technology for this type of intelligent network grid has been developed. It has not, however, been widely deployed. Utilities, in fact, have been working on basic distribution automation (DA) – the capability to operate the grid remotely – for a number of years.

As mentioned earlier, most theorists – not to mention politicians and regulators – feel that utilities will have to enable AMI and demand response/home automation if they’re to encourage energy conservation in an impending era of short supplies. While automated meter reading (AMR) has been around for a long time, its penetration remains relatively small in the utilities industry – especially in the case of advanced AMI meters for enabling demand response. According to figures released by Sierra Energy Group and Newton-Evans Research Co., only 8 to 10 percent of this country’s utilities were using AMI meters by 2008.

That said, the push for AMI on the part of both EPACT 2005 and regulators is having an obvious effect. Numerous utilities (including companies like Entergy and Southern Co.) that previously refused to consider AMR now have AMI projects in progress. However, even though an anticipated building boom in AMI is finally underway, there’s still much to be done to enable the demand response that will be desperately needed by 2016.

THE AUTOMATED HOME

The final area we can expect the IUE/SG concept to envelop comes at the residential level. With residential home automation in place, utilities will be able to control usage directly – by adjusting thermostats or compressor cycling, or via other techniques. Again, the technology for this has existed for some time; however, there are very few installations nationwide. A number of experiments were conducted with home automation in the early- to mid-1990s, with some subdivisions even being built under the mantra of “demand-side management.”

Demand response – the term currently in vogue with politicians – may be considered more politically correct, but the net result is the same. Home automation will enable regulators, through utilities, to ration usage. Although politicians avoid using the word rationing, if global warming concerns continue to seriously impact utilities’ ability to access adequate generation, rationing will be the result – making direct load control at the residential level one of the most problematic issues in the distributed utility paradigm of the future. Are large numbers of Americans going to acquiesce calmly to their electrical supply being rationed? No one knows, but there seem to be few options.

GREEN PRESSURE AND THE TIPPING POINT

While much legitimate scientific debate remains about whether global warming is real and, if so, whether it’s a naturally occurring or man-made phenomenon (arising primarily from carbon dioxide emissions), that debate is diminishing among politicians at every level. The majority of politicians, in fact, have bought into the notion that carbon emissions from many sources – primarily the generation of electricity by burning coal – are the culprit.

Thus, despite continued scientific debate, the political tipping point has been reached, and U.S. politicians are making moves to force this country’s utility industry to adapt to a situation that may or may not be real. Whether or not it makes logical or economic sense, utilities are under increasing pressure to adopt the Intelligent Utility/Smart Grid/Home Automation/Demand Response model – a model that includes many small generation points to make up for fewer large plants. This political tipping point is also shutting down more proposed generation projects each month, adding to the likely shortage. Since 2000, approximately 50 percent of all proposed new coal-fired generation plants have been canceled, according to energy-industry adviser Wood Mackenzie (Gas and Power Service Insight, February 2008).

In the distant future, as technology continues to advance, electric generation in the United States will likely include a mix of energy sources, many of them distributed and green. However, there’s no way that in the next 10 years – the window of greatest concern in the NERC projections on the generation and reliability side – green energy will be ready and available in sufficient quantities to forestall a significant electricity shortfall. Nuclear energy represents the only truly viable solution; however, ongoing opposition to this form of power generation makes it unlikely that sufficient nuclear energy will be available within this period. The already-lengthy licensing process (though streamlined somewhat of late by the Nuclear Regulatory Commission) is exacerbated by lawsuits and opposition every step of the way. In addition, most of the necessary engineering and manufacturing processes have been lost in the United States over the last 30 years – the time elapsed since the last U.S. nuclear plant was built – making it necessary to reacquire that knowledge from abroad.

The NERC Reliability Report of Oct. 15, 2007, points strongly toward a significant shortfall of electricity within approximately 10 years – a situation that could lead to rolling blackouts and brownouts in parts of the country that have never experienced them before. It could also lead to mandatory “demand response” – in other words, rationing – at the residential level. This situation, however, is not inevitable: technology exists to prevent it (including nuclear and cleaner coal now as well as a gradual development of solar, biomass, sequestration and so on over time, with wind for peaking). But thanks to concern over global warming and other issues raised by the environmental community, many politicians and regulators have become convinced otherwise. And thus, they won’t consider a different tack to solving the problem until there’s a public outcry – and that’s not likely to occur for another 10 years, at which point the national economy and utilities may already have suffered tremendous (possibly irreparable) harm.

WHAT CAN BE DONE?

The problem the utilities industry faces today is neither economic nor technological – it’s ideological. The global warming alarmists are shutting down coal before sufficient economically viable replacements (with the possible exception of nuclear) are in place. And the rest of the options are tied up in court. (For example, the United States needs some 45 liquefied natural gas terminals to support a conversion to gas – a costly fuel with iffy reliability – but only five have been built; the rest are tied up in court.) As long as it’s possible to tie up nuclear applications for five to 10 years and shut down “clean coal” plants through the political process, the U.S. utility industry is left with few options.

So what are utilities to do? They must get much smarter (IUE/SG), and they must prepare for rationing (AMI/demand response). As seen in SEG studies, utilities still have a ways to go in these areas, but at least this is a strategy that can (for the most part) be put in place within 10 to 15 years. The technology for IUE/SG already exists; it’s relatively inexpensive (compared with large-scale green energy development and nuclear plant construction); and utilities can employ it with relatively little regulatory oversight. In fact, regulators are actually encouraging it.

For these reasons, IUE/SG represents a major bridge to a more stable future. Even if today’s apocalyptic scenarios fail to develop – that is, global warming is debunked, or new generation sources develop much more rapidly than expected – intelligent utilities with smart grids will remain a good idea. The paradigm is shifting as we watch – but will that shift be completed in time to prevent major economic and social dislocation? Fasten your seatbelts: the next 10 to 15 years should be very interesting!

Advanced Metering Infrastructure: The Case for Transformation

Although the most basic operational benefits of an advanced metering infrastructure (AMI) initiative can be achieved by simply implementing standard technological features and revamping existing processes, this approach fails to leverage the full potential of AMI to redefine the customer experience and transform the utility operating model. In addition to the obvious operational benefits – including a significant reduction in field personnel and a decrease in peak load on the system – AMI solutions have the potential to achieve broader strategic, environmental and regulatory benefits by redefining the utility-customer relationship. To capture these broader benefits, however, utilities must view AMI as a transformation initiative, not simply a technology implementation project. Utilities must couple their AMI implementations with a broader operational overhaul and take a structured approach to applying the operating capabilities required to take advantage of AMI’s vast opportunities. One key step in this structured approach to transformation is enterprise-wide business process design.

WHY “AS IS” PROCESSES WON’T WORK FOR AMI

Due to the antiquated and fragmented nature of utility processes and systems, adapting “as is” processes alone will not be sufficient to realize the full range of AMI benefits. Multiple decades of industry consolidation have resulted in utilities with diverse business processes reflecting multiple legacy company operating practices. Associated with these diverse business processes is a redundant set of largely homegrown applications resulting in operational inefficiencies that may impact customer service and reliability, and prevent utilities from adapting to new strategic initiatives (such as AMI) as they emerge.

For example, in the as-is environment, utilities are often slow to react to changes in customer preferences and require multiple functional areas to respond to a simple customer request. A request by a customer to enroll in a new program, for example, will involve at least three organizations within the utility: the call center initially handles the customer request; the field services group manages changing or reprogramming the customer’s meter to support the new program; and the billing group processes the request to ensure that the customer is correctly enrolled in the program and is billed accordingly. In most cases, even a simple request like this can result in long delays for the customer due to disjointed processes with multiple hand-off points.

WHY USE AMI AS THE CATALYST FOR OPERATIONAL TRANSFORMATION?

The revolutionary nature of AMI technology and its potential for application across multiple areas of the utility make an AMI implementation the perfect opportunity to adapt the utility operating structure. To use AMI as a platform for operational transformation, utilities must shift from a functionally based mindset to an enterprise-wide, process-centric one. This approach will ensure that utilities take full advantage of AMI’s technological capabilities without being constrained by existing processes and organizational structures.

If the utility is to offer new programs and services as well as respond to shifting external demands, it must anticipate and respond quickly to changes in customer behavior. Rapid information dissemination and quick response to changes in business, environmental and economic situations are essential for utilities that wish to encourage customers to think of energy in a new way and proactively manage their usage through participation in time-of-use and real-time demand response programs. This transition requires that system and organizational hand-offs be integrated to create a seamless and flexible work flow. Without this integration, utilities cannot proactively and quickly adapt processes to satisfy ever-increasing customer expectations. In essence, AMI fails if “smart meters” and “smart systems” are implemented without “smart processes” to support them.

DESIGNING SMART PROCESSES

Designing smart future state business processes to support transformational initiatives such as AMI involves more than just rearranging existing work flows. Instead, a utility must adopt a comprehensive approach to business process design – one that engages stakeholders throughout the organization and that enables them to design processes from the ground up. The utility must also design flexible processes that can adapt to changing customer, technology, business and regulatory expectations while avoiding the pitfalls of the current organization and process structure. As part of a utility’s business process design effort, it must also redefine jobs more broadly, increase training to support those jobs, enable decision making by front-line personnel and redirect reward systems to focus on processes as well as outcomes. Utilities must also reshape organizational cultures to emphasize teamwork, personal accountability and the customer’s importance; to redefine roles and responsibilities so that managers oversee processes instead of activities and develop people rather than supervise them; and to realign information systems so that they help cross-functional processes work smoothly rather than simply support individual functional areas.

BUSINESS PROCESS DESIGN FRAMEWORK

IBM’s enterprise-wide business process design framework provides a structured approach to the development of the future state processes that support operational transformations and the complexities of AMI initiatives. This framework empowers utilities to apply business process design as the cornerstone of a broader effort to transition to a customer-centric organization capable of engaging external stakeholders. In addition, this framework supports corporate decision making and continuous improvement by emphasizing real-time metrics and measurement of operational procedures. The framework is made up of the following five phases (Figure 1):

Phase 1 – As-is functional assessment. During this phase, utilities assess their current state processes and supporting organizations and systems. The goal of this phase is to identify gaps, overlaps and conflicts within existing processes and to identify opportunities to leverage the AMI technology. This assessment requires utility stakeholders to dissect existing processes throughout the organization and identify instances where the utility is unable to fully meet customer, environmental and regulatory demands. The final step in this phase is to define a set of “future state” goals to guide process development. These goals must address all of the relevant opportunities to both improve existing processes and perform new functions and services.

Phase 2 – Future state process analysis. During this phase, utilities design end-to-end processes that meet the future state goals defined in Phase 1. To complete this effort, utilities must synthesize components from multiple functional areas and think outside the current organizational hierarchy. This phase requires engagement from participants throughout the utility organization, and participants should be encouraged to envision all relevant opportunities for using AMI to improve the utility’s relationship with customers, regulators and the environment. At the conclusion of this phase, all processes should be assessed in terms of their ability to alleviate the current state issues and to meet the future state goals defined in Phase 1.

Phase 3 – Impact identification. During this phase, utilities identify the organizational structure and corporate initiatives necessary to “operationalize” the future state processes. Key questions answered during this phase include: How will the utility transition from the current state to the future state? How will each functional area absorb the necessary changes? What new organizations, roles and skills are needed? This phase requires the utility to think outside of the current organizational structure to identify the optimal way to support the processes designed in Phase 2. During the impact identification phase, it is crucial that process be positioned as the dominant organizational axis. Because process-organized utilities are not bound to a conventional hierarchy or fixed organizational structure, they can be customer-centric, make flexible use of their resources and respond rapidly to new business situations.

Phase 4 – Socialization. During this phase, utilities focus on obtaining ownership and buy-in from the impacted organizations and broader group of internal and external stakeholders. This phase often involves piloting the new processes and technology in a test environment and reaching out to a small set of customers to solicit feedback. This phase is also marked by the transition of the products from the first three phases of the business process design effort to the teams affected by the new processes – namely the impacted business areas as well as the organizational change management and information technology teams.

Phase 5 – Implementation and measurement. During the final phase of the business process design framework, the utility transitions from planning and design to implementation. The first step of this phase is to define the metrics and key performance indicators (KPIs) that will be used to measure the success of the new processes – these are necessary both to hold organizations and managers accountable for the new processes and to guide continuous refinement and improvement. After these metrics have been established, the new organizational structure is put in place and the new processes are introduced to this structure.

BENEFITS AND CHALLENGES OF BUSINESS PROCESS DESIGN

The business process design framework outlined above facilitates the permeation of the utility goals and objectives throughout the entire organization. This effort does not succeed, though, without significant participation from internal stakeholders and strong sponsorship from key executives.

The benefits of this approach include the following:

  • It facilitates ownership. Because the management team is engaged at the beginning of the AMI transformation, managers are encouraged to own future state processes from initial design through implementation.
  • It identifies key issues. A comprehensive business design effort allows for earlier visibility into key integration issues and provides ample time to resolve them prior to rolling out the technologies to the field.
  • It promotes additional capabilities. The business process framework enables the utility to develop innovative ways to apply the AMI technology and ensures that future state processes are aligned to business outcomes.
  • It puts the focus on customers. A thorough business process effort ensures that the necessary processes and functional groups are put in place to empower and inform the utility customer.

The challenges of this approach include the following:

  • It entails a complex transition. The utility must manage the complexities and ambiguities of shifting from functional-based operations to process-based management and decision making.
  • It can lead to high expectations. The utility must also manage stakeholder expectations and be clear that change will be gradual and, at times, painful. Revolutionary change is made through evolutionary steps – utilities cannot expect to take very large steps at any point in the process.
  • There may be technological limitations. Throughout the business process design effort, utilities will identify new ways to improve customer satisfaction through the use of AMI technology. The standard technology, however, may not always support these visions; thus, utilities must be prepared to work with vendors to support the new processes.

Although execution of future state business process design undoubtedly requires a high degree of effort, a successful operational transformation is necessary to truly leverage the features of AMI technology. If utilities expect to achieve broad-reaching benefits, they must put in place the operational and organizational structures to support the transformational initiatives. Utilities cannot afford to think of AMI as a standard technology implementation or to jump immediately to the definition of system and technology requirements. That approach will inevitably limit the impact of AMI solutions and leave utilities implementing cutting-edge technology with fragmented processes and inflexible, functionally based organizational structures.