The Technology Demonstration Center

When a utility undergoes a major transformation – such as adopting new technologies like advanced metering – the costs and time involved require that the changes be accepted and adopted by each of the three major stakeholder groups: regulators, customers and the utility's own employees. A technology demonstration center serves as an important tool for promoting acceptance and adoption of new technologies by displaying tangible examples and demonstrating the future customer experience. IBM has developed the technology center development framework as a methodology to efficiently define the strategy and tactics required to develop a technology center that will elicit the desired responses from those key stakeholders.

KEY STAKEHOLDER BUY-IN

To successfully implement major technology change, utilities need to consider the needs of the three major stakeholders: regulators, customers and employees.

Regulators. Utility regulators are naturally wary of any transformation that affects their constituents on a grand scale, and thus their concerns must be addressed to encourage regulatory approval. The technology center serves two purposes in this regard: educating the regulators and showing them that the utility is committed to educating its customers on how to receive the maximum benefits from these technologies.

Given the size of a transformation project, it’s critical that regulators support the increased spending required and any consequent increase in rates. Many regulators, even those who favor new technologies, believe that the utility will benefit the most and should thus cover the cost. If utilities expect cost recovery, the regulators need to understand the complexity of new technologies and the costs of the interrelated systems required to manage these technologies. An exhibit in the technology center can go “behind the curtain,” giving regulators a clearer view of these systems, their complexity and the overall cost of delivering them.

Finally, each stage in the deployment of new technologies requires a new approval process and provides opportunities for resistance from regulators. For the utility, staying engaged with regulators throughout the process is imperative, and the technology center provides an ideal way to continue the conversation.

Customers. Once regulators give their approval, the utility must still make its case to the public. The success of a new technology project rests on customers' adoption of the technology. For example, if customers continue using appliances as they always have – at a steady pace throughout the day, without adjusting for off-peak pricing – the utility will fail to achieve the major planned cost advantage: a reduction in production facilities. Wide-scale customer adoption is therefore key. Indeed, general estimates indicate that customer adoption rates of roughly 20 percent are needed to break even in a critical peak-pricing model. [1]

Given the complexity of these technologies, it’s quite possible that customers will fail to see the value of the program – particularly in the context of the changes in energy use they will need to undertake. A well-designed campaign that demonstrates the benefits of tiered pricing will go a long way toward encouraging adoption. By showcasing the future customer experience, the technology center can provide a tangible example that serves to create buzz, get customers excited and educate them about benefits.

Employees. Obtaining employee buy-in on new programs is as important as winning over the other two stakeholder groups. For transformation to be successful, an understanding of the process must be moved out of the boardroom and communicated to the entire company. Employees whose responsibilities will change need to know how they will change, how their interactions with the customer will change and what benefits are in it for them. At the same time, utility employees are also customers. They talk to friends and spread the message. They can be the utility’s best advocates or its greatest detractors. Proper internal communication is essential for a smooth transition from the old ways to the new, and the technology center can and should be used to educate employees on the transformation.

OTHER GOALS FOR THE TECHNOLOGY DEMONSTRATION CENTER

The objectives discussed above represent one possible set of goals for a technology center. Utilities may well have other reasons for erecting the technology center, and these should be addressed as well. As an example, the utility may want to present a tangible display of its plans for the future to its investors, letting them know what’s in store for the company. Likewise, the utility may want to be a leader in its industry or region, and the technology center provides a way to demonstrate that to its peer companies. The utility may also want to be recognized as a trendsetter in environmental progress, and a technology center can help people understand the changes the company is making.

The technology center needs to be designed with the utility's particular environment in mind. The technology center development framework is, in essence, a road map that helps the utility rank the technology center's key strategic goals and components to maximize its impact on the intended audience.

DEVELOPING THE TECHNOLOGY CENTER

Unlike other aspects of a traditional utility, the technology center needs to appeal to customers visually, as well as explain the significance and impact of new technologies. The technology center development framework presented here was developed by leveraging trends and experiences in retail, including “experiential” retail environments such as the Apple Stores in malls across the United States. These new retail environments offer a much richer and more interactive experience than traditional retail outlets, which may employ some basic merchandising and simply offer products for sale.

Experiential environments have arisen partly in response to competition from online retailers and the increased complexity of products. The technology center development framework applies the same state-of-the-art design strategies adopted by high-end retailers, helping the utility's executives and leadership create a compelling experience that elicits the desired response and buy-in from the stakeholders described above.

Phase 1: Technology Center Strategy

During this phase, a utility typically spends four to eight weeks developing an optimal strategy for the technology center. To accomplish this, planners identify and delineate in detail three major elements:

  • The technology center’s goals;
  • Its target audience; and
  • Content required to achieve those goals.

As shown in Figure 1, these elements are not defined in isolation; the process is iterative: The technology center's goals set the stage for determining the audience and content, and those two elements influence each other. The outcome of this phase is a complete strategy road map that defines the direction the technology center will take.

To understand the Phase 1 objectives properly, it’s necessary to examine the logic behind them. The methodology focuses on the three elements mentioned previously – goals, audience and content – because these are easily overlooked and misaligned by organizations.

Utility companies inevitably face multiple and competing goals. Thus, it’s critical to identify the goals specifically associated with the technology center and to distinguish them from other corporate goals or goals associated with implementing a new technology. Taking this step forces the organization to define which goals can be met by the technology center with the greatest efficiency, and establishes a clear plan that can be used as a guide in resolving the inevitable future conflicts.

Similarly, the stakeholders served by the utility represent distinct audiences. Based on the goals of the center and the organization, as well as the internal expectations set by managers, the target audience needs to be well defined. Many important facets of the technology center, such as content and location, will be partly determined by the target audience. Finally, the right content is critical to success. A regulator may want to see different information than customers do.

In addition, the audience’s specific needs dictate different content options. Do the utility’s customers care about the environment? Do they care more about advances in technology? Are they concerned about how their lives will change in the future? These questions need to be answered early in the process.

The key to successfully completing Phase 1 is constant engagement with the utility's decision makers, since their expectations for the technology center will vary greatly depending on their responsibilities. Throughout this phase, the technology center's planners need to meet with these decision makers on a regular basis, gather and respect their opinions, and arrive at the optimal mix for the utility as a whole. This can be done through interviews or a series of workshops, whichever is better suited to the utility. We have found that by employing this process, an organization can develop a framework of goals, audience and content mix that everyone will agree on – despite differing expectations.

Phase 2: Design Characteristics

The second phase of the development framework focuses on the high-level physical layout of the technology center. These “design characteristics” will affect the overall layout and presentation of the technology center.

We have identified six key characteristics that need to be determined. Each is framed as a trade-off between two extremes, which helps utilities understand the issues involved and debate the solutions. There are no right answers to these issues – the optimal solution depends on the utility's environment and expectations:

  • Small versus large. The technology center can be small, like a cell phone store, or large, like a Best Buy.
  • Guided versus self-guided. The center can be designed to allow visitors to guide themselves, or staff can be retained to guide visitors through the facility.
  • Single versus multiple. There may be a single site, or multiple sites. As with the first issue (small versus large), one site may be a large flagship facility, while the others represent smaller satellite sites.
  • Independent versus linked. Depending on the nature of the exhibits, technology center sites may operate independently of each other or include exhibits that are remotely linked in order to display certain advanced technologies.
  • Fixed versus mobile. The technology center can be in a fixed physical location, but it can also be mounted on a truck bed to bring the center to audiences around the region.
  • Static versus dynamic. The exhibits in the technology center may become outdated. How easy will it be to change or swap them out?

Figure 2 illustrates a sample set of design characteristics for one technology center, using a sample design characteristic map. This map shows each of the characteristics laid out around the hexagon, with the preference ranges represented at each vertex. By mapping out the utility’s options with regard to the design characteristics, it’s possible to visualize the trade-offs inherent in these decisions, and thus identify the optimal design for a given environment. In addition, this type of map facilitates reporting on the project to higher-level executives, who may benefit from a visual executive summary of the technology center’s plan.
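
To make the design characteristic map concrete, the minimal sketch below encodes one hypothetical set of the six trade-off scores (0.0 = first extreme, 1.0 = second) and renders them as a text chart. The scores and the rendering are illustrative assumptions, not outputs of the framework itself.

```python
# Illustrative sketch of a design characteristic map: each of the six
# trade-offs is scored on a 0.0-1.0 scale between its two extremes.
# The axis names follow the list above; the scores are hypothetical.

DESIGN_AXES = [
    ("small vs. large", 0.7),         # leaning toward a large flagship site
    ("guided vs. self-guided", 0.3),  # mostly staff-guided tours
    ("single vs. multiple", 0.2),     # one site to start
    ("independent vs. linked", 0.1),  # no remotely linked exhibits yet
    ("fixed vs. mobile", 0.0),        # fixed location only
    ("static vs. dynamic", 0.8),      # exhibits designed to be swapped out
]

def describe_map(axes):
    """Render the hexagonal map as a crude text chart."""
    for name, score in axes:
        bar = "#" * int(score * 10)
        print(f"{name:26s} |{bar:<10s}| {score:.1f}")

describe_map(DESIGN_AXES)
```

Plotting the same six scores on the vertices of a hexagon yields the visual executive summary described above.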

The tasks in Phase 2 require the utility's staff to be just as engaged as in the strategy phase. Planners should conduct a workshop or interviews with staff members who understand the varied needs of the utility's region and customer base in order to work out an optimal plan.

Phase 3: Execution Variables

Phases 1 and 2 provide a strategy and design for the technology center, and allow the utility’s leadership to formulate a clear vision of the project and come to agreement on the ultimate purpose of the technology center. Phase 3 involves engaging the technology developers to identify which aspects of the new technology – for example, smart appliances, demand-side management, outage management and advanced metering – will be displayed at the technology center.

During this phase, utilities should create a complete catalog of the technologies that will be demonstrated, and match them up against the strategic content mix developed in Phase 1. A ranking is then assigned to each potential new technology based on several considerations, such as how well it matches the strategy, how feasible it is to demonstrate the given technology at the center, and what costs and resources would be required. Only the most efficient and well-matched technologies and exhibits will be displayed.
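
A minimal sketch of this kind of ranking, assuming a simple weighted-sum score, appears below. The candidate list, the 1-to-5 scales and the weights are hypothetical illustrations, not part of IBM's published methodology.

```python
# Hypothetical Phase 3 scoring: rank candidate technologies by strategy
# fit, feasibility of demonstration, and cost. Higher cost_score means a
# cheaper exhibit. All names, scales and weights are assumptions.

CANDIDATES = {
    # name: (strategy_fit, feasibility, cost_score), each on a 1-5 scale
    "advanced metering":      (5, 4, 3),
    "smart appliances":       (4, 5, 4),
    "demand-side management": (5, 3, 3),
    "outage management":      (3, 3, 2),
}

WEIGHTS = (0.5, 0.3, 0.2)  # strategy fit weighted most heavily

def rank(candidates, weights):
    scored = {
        name: sum(w * s for w, s in zip(weights, scores))
        for name, scores in candidates.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for name, score in rank(CANDIDATES, WEIGHTS):
    print(f"{score:.2f}  {name}")
```

Only the exhibits at the top of such a ranking would move forward into design and budgeting.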

During Phase 3, outside vendors are also engaged, including architects, designers, mobile operators (if necessary) and real estate agents, among others. With the first two phases providing a guide, the utility can now open discussions with these vendors and present a clear picture of what it wants. The technical requirements for each exhibit will be cataloged and recorded to ensure that any design will take all requirements into account. Finally, the budget and work plan are written and finalized.

CONCLUSION

With the planning framework completed, the team can now build the center. The framework serves as the blueprint for the center, and all relevant benchmarks must be transparent and open for everyone to see. Disagreements during the buildout phase can be referred back to the framework, and issues that don’t fit the framework are discarded. In this way, the utility can ensure that the technology center will meet its goals and serve as a valuable tool in the process of transformation.

Thank you to Ian Simpson, IBM Global Business Services, for his contributions to this paper.

ENDNOTE

  1. Critical peak pricing refers to the model whereby utilities use peak pricing only on days when demand for electricity is at its peak, such as extremely hot days in the summer.

The Virtual Generator

Electric utility companies today constantly struggle to find a balance between generating sufficient power to satisfy their customers’ dynamic load requirements and minimizing their capital and operating costs. They spend a great deal of time and effort attempting to optimize every element of their generation, transmission and distribution systems to achieve both their physical and economic goals.

In many cases, “real” generators waste valuable resources – waste that, if not managed efficiently, goes directly to the bottom line. Energy companies therefore find the concept of a “virtual generator,” or a virtual source of energy that can be turned on when needed, very attractive. Although virtual generators generally represent only a small percentage of a utility's overall generation capacity, they are quick to deploy and cost-effective, and they represent a form of “green energy” that can help utilities meet carbon emission standards.

Virtual generators use forms of dynamic voltage and capacitance (Volt/VAr) adjustments that are controlled through sensing, analytics and automation. The overall process involves first flattening or tightening the voltage profiles by adding voltage regulators to the distribution system. Then, by moving the voltage profile up or down within the operational voltage bounds, utilities can achieve significant benefits (Figure 1). It's important to understand, however, that because voltage adjustments influence VArs, utilities must also adjust both the placement and control of capacitors (Figure 2).

Various business drivers will influence the use of Volt/VAr. A utility could, for example, use Volt/VAr to:

  • Respond to an external system-wide request for emergency load reduction;
  • Assist in reducing a utility’s internal load – both regional and throughout the entire system;
  • Target specific feeder load reduction through the distribution system;
  • Respond as a peak load relief (a virtual peaker);
  • Optimize Volt/VAr for better reliability and more resiliency;
  • Maximize the efficiency of the system and subsequently reduce energy generation or purchasing needs;
  • Achieve economic benefits, such as generating revenue by selling power on the spot market; and
  • Supply VArs to supplement off-network deficiencies.

Each of the above potential benefits falls into one of four domains: peaking relief, energy conservation, VAr management or reliability enhancement. The peaking relief and energy conservation domains deal with load reduction; VAr management, logically enough, involves management of VArs; and reliability enhancement actually increases load. In this latter domain, the utility uses increased voltage to enable greater voltage tolerances in self-healing grid scenarios, or to improve the performance of non-constant-power devices so they can be removed from the system sooner, thereby improving load diversity.

Volt/VAr optimization can be applied to all of these scenarios. It is intended either to drive the power factor of a utility's distribution network toward unity, or to purposely make the power factor leading in anticipation of a change in load characteristics.

Each of these potential benefits comes from solving a different business problem. Because of this, at times they can even be at odds with each other. Utilities must therefore create fairly complex business rules supported by automation to resolve any conflicts that arise.

Although the concept of load reduction using Volt/VAr techniques is not new, the ability to automate the capabilities in real time and drive the solutions with various business requirements is a relatively recent phenomenon. Energy produced with a virtual generator is neither free nor unlimited. However, it is real in the sense that it allows the system to use energy more efficiently.

A number of things are driving utilities' current interest in virtual generators, including the fact that sensors, analytics, simulation, geospatial information, business process logic and other forms of information technology are increasingly affordable and robust. In addition, lower-cost intelligent electronic devices (IEDs) make virtual generators possible and bring them within reach of most electric utility companies.

The ability to innovate an entirely new solution to support the above business scenarios is now within the realm of possibility for the electric utility company. As an added benefit, much of the base IT infrastructure required for virtual generators is the same as that required for other forms of “smart grid” solutions, such as advanced metering infrastructure (AMI), demand-side management (DSM), distributed generation (DG) and enhanced fault management. Utilities that implement a well-designed virtual generator solution will ultimately be able to align it with these other power management solutions, thus optimizing all customer offerings that will help reduce load.

HOW THE SOLUTION WORKS

All utilities are required, for regulatory or reliability reasons, to stay within certain high- and low-voltage parameters for all of their customers. In the United States, the American Society for Testing and Materials (ASTM) guidelines specify that the nominal voltage for a residential single-phase service should be 120 volts, with a plus or minus 6-volt variance (that is, 114 to 126 volts). Other countries around the world have similar guidelines. Whatever the actual values are, all utilities are required to operate within these high- and low-voltage “envelopes.” In some cases, additional requirements may be imposed as to the amount of variance – the number of volts changed or the percent change in the voltage – that can take place over a period of minutes or hours.

Commercial customers may have different high/low values, but the principle remains the same. In fact, it is the mixture of residential, commercial and industrial customers on the same feeder that makes the virtual generation solution almost a requirement if a utility wants to optimize its voltage regulation.

Although it would be ideal for a utility to deliver 120-volt power consistently to all customers, the physical properties of the distribution system as well as dynamic customer loading factors make this difficult. Most utilities are already trying to accomplish this through planning, network and equipment adjustments, and in many cases use of automated voltage control devices. Despite these efforts, however, in most networks utilities are required to run the feeder circuit at higher-than-nominal levels at the head of the circuit in order to provide sufficient voltage for downstream users, especially those at the tails or end points of the circuit.

In a few cases, electric utilities have added manual or automatic voltage regulators to step up voltage at one or more points in a feeder circuit because of nonuniform loading and/or varied circuit impedance characteristics throughout the circuit profile. This stepped-up slope, or curve, allows the utility company to comply with the voltage level requirements for all customers on the circuit. In addition, utilities can satisfy the VAr requirements for operational efficiency of inductive loads using switched capacitor banks, but they must coordinate those capacitor banks with voltage adjustments as well as power demand. Refining voltage profiles through virtual generation usually implies a tight corresponding control of capacitance as well.

The theory behind a robust Volt/VAr regulated feeder circuit is based on the same principles, applied in an innovative manner. Rather than just using voltage regulators to keep the voltage profile within the regulatory envelope, utilities try to “flatten” the voltage curve or slope. (In reality, the overall effect is a stepped slope profile, owing to economic limits on the number of voltage regulators applied per circuit.) This flattening allows an overall reduction in nominal voltage. In turn, the operator may choose to move the voltage curve up or down within the regulatory voltage envelope. Utilities can derive extra benefit from this solution because all customers within a given section of a feeder circuit can be provided with the same voltage level, which should result in fewer “problem” customers who happen to sit at less-than-ideal places on the circuit. It can also minimize the power wasted by overdriving the voltage at the head of the feeder in order to satisfy customers at the tails.
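
As a rough numerical illustration of “flatten, then lower” – not a power-flow model – the sketch below checks how far a flattened profile can be lowered while staying inside the 114-to-126-volt envelope described above. The voltage profiles and the CVR factor of 0.8 are assumed values chosen for exposition.

```python
# Rough illustration of "flatten, then lower." Voltages are per-customer
# service voltages along one feeder; the 114-126 V envelope comes from the
# text, while the profiles and CVR factor are illustrative assumptions.

ENVELOPE = (114.0, 126.0)

# Unregulated profile: head of feeder driven high so the tail stays legal.
unregulated = [125.5, 124.0, 122.0, 119.5, 117.0, 114.5]

# After adding regulators, the profile is flatter (stepped in practice).
flattened = [121.0, 120.5, 120.0, 120.5, 120.0, 119.5]

def headroom(profile, envelope=ENVELOPE):
    """How many volts the whole profile can drop and remain legal."""
    low, _ = envelope
    return min(profile) - low

def lower(profile, volts):
    return [v - volts for v in profile]

drop = headroom(flattened)          # 5.5 V of room in this example
new_profile = lower(flattened, drop)

# Rule of thumb: each 1% voltage reduction yields roughly CVR_FACTOR
# percent load reduction (0.8 here is an assumed value).
CVR_FACTOR = 0.8
pct_v = 100 * drop / 120.0
print(f"Lower by {drop:.1f} V ({pct_v:.1f}%), "
      f"~{CVR_FACTOR * pct_v:.1f}% load reduction")
```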

THE ROLE OF AUTOMATION IN DELIVERING THE VIRTUAL GENERATOR

Although theoretically simple in concept, executing and maintaining a virtual generator solution is a complex task that requires real-time coordination of many assets and business rules. Electrical distribution networks are dynamic systems with constantly changing demands, parameters and influencers. Without automation, utilities would find it impossible to deliver and support virtual generators, because it’s infeasible to expect a human – or even a number of humans – to operate such systems affordably and reliably. Therefore, utilities must leverage automation to put humans in monitoring rather than controlling roles.

There are many “inputs” to an automated solution that supports a virtual generator. These include both dynamic and static information sources. For example, real-time sensor data monitoring the condition of the networks must be merged with geospatial information, weather data, spot energy pricing and historical data in a moment-by-moment, repeating cycle to optimize the business benefits of the virtual generator. Complicating this, in many cases the team managing the virtual generator will not “own” all of the inputs required to feed the automated system. Frequently, they must share this data with other applications and organizational stakeholders. It’s therefore critical that utilities put into place an open, collaborative and integrated technology infrastructure that supports multiple applications from different parts of the business.

One of the most critical aspects of automating a virtual generator is having the right analytical capabilities to decide where and how the virtual generator solution should be applied to support the organization's overall business objectives. For example, utilities should use load predictors and state estimators to determine future states of the network based on load projections under the various Volt/VAr scenarios they're considering. Additionally, they should use advanced analytics to determine the resiliency of the network, or the probability of internal or external events influencing the virtual generator's application requirements. Still other types of analyses can provide utilities with a current view of the state of the virtual generator and how much energy it's returning to the system.
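
A minimal sketch of such a business-rule layer appears below, assuming a simple priority ordering among the four domains named earlier. The class, thresholds and rule order are hypothetical; a production system would resolve conflicts with far richer rules and live analytics.

```python
# Minimal sketch of a rule layer that resolves competing uses of the
# virtual generator. The domains mirror the four named above; all names
# and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class GridState:
    forecast_peak_mw: float   # from a load predictor
    capacity_mw: float        # available generation plus purchases
    emergency_request: bool   # external system-wide curtailment call
    healing_event: bool       # self-healing scenario needing extra voltage

def choose_volt_var_mode(state: GridState) -> str:
    """Pick one mode per control cycle; earlier rules win conflicts."""
    if state.healing_event:
        return "reliability: raise voltage for switching tolerance"
    if state.emergency_request:
        return "peaking relief: lower voltage, maximum load reduction"
    if state.forecast_peak_mw > 0.95 * state.capacity_mw:
        return "peaking relief: lower voltage on stressed feeders"
    return "energy conservation: hold flattened profile at low setpoint"

print(choose_volt_var_mode(
    GridState(forecast_peak_mw=980, capacity_mw=1000,
              emergency_request=False, healing_event=False)))
```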

While it is important that all these techniques be used in developing a comprehensive load-management strategy, they must be unified into an actionable, business-driven solution. That solution must account for the value each virtual generator technique delivers, its availability, and the ability to coordinate all of the techniques at all times. A voltage management scheme that is already fully committed to supporting customer load requirements throughout the peak day will be of little additional use for load relief. It is imperative, then, that the utility understand the combined effect of all its voltage management solutions at the times they are needed to support the energy demands on the system.

Tomorrow’s Bill Payment Solutions for Today’s Businesses

Providing consumers with innovative services for more than 150 years, Western Union is an established leader in electronic and cash bill-payment solutions. We introduced our first consumer-to-consumer money transfer service in 1871 and began offering consumer-to-business bill payment services in 1989 with the introduction of the Western Union Quick Collect® service, providing consumers in the United States with convenient walk-in agent network locations where they can pay bills in cash.

By 2008, our comprehensive suite of services had grown to include Speedpay® – an electronic bill payment option that provides businesses with Internet, IVR, desktop, mobile payments, online banking and call center solutions, as well as e-bill presentment with payments and interactive outbound messaging integrated with payment processing.

THE CONSUMER-TO-BUSINESS SEGMENT

Western Union’s electronic and cash bill payment services provide consumers with fast, convenient ways to send one-time or recurring payments to a broad spectrum of industries. At Western Union we have relationships with more than 6,000 businesses and organizations that receive consumer payments, including utilities, auto finance companies, mortgage servicers, financial service providers and government agencies. These relationships form a core component of our consumer-to-business payment service and are one reason we were able to process 404 million consumer-to-business transactions in 2007.

PORTFOLIO OF SERVICES

Our consumer-to-business services give consumers choices in payment type and method, and include the following options:

  • Electronic payments. Consumers and billers use our Speedpay® service in the United States and the United Kingdom to make consumer payments to a variety of billers using credit cards, ATM cards and debit cards, and via ACH withdrawal. Payments are initiated through multiple channels, including biller-hosted websites, westernunion.com, IVR units, Online Banking websites and call centers.
  • Cash payments. Consumers use our Quick Collect® or Prepaid® services to send guaranteed funds to businesses and government agencies using cash (and in select locations, debit cards). Quick Collect is available at nearly 60,000 Western Union agent locations across the United States and Canada, while our Prepaid service can be accessed at more than 40,000 U.S. locations. Consumers can also use our Convenience Pay® service to send payments by cash or check from a smaller number of agent locations primarily to utilities and telecommunication providers.

DISTRIBUTION AND MARKETING CHANNELS

Our electronic payment services are available primarily through an IVR, over the Internet and via call centers, where a desktop application is used while the consumer speaks with a biller's customer service representative. Through our Quick Pay® service, billers can receive payments sent from outside the United States or Canada from more than 320,000 agent locations in more than 200 countries and territories around the world. We work in partnership with our billers to market our services to consumers in a number of ways, including direct mail, Email, Internet and point-of-sale advertising.

ONLINE BANKING

In late 2007, Western Union launched its Online Banking initiative, helping to change the way consumers pay their bills. The channel shortens the time it takes billers to receive payment from two to four days to next-day or same-day delivery, and enables Western Union Payment Services to process bill payments initiated by consumers from their banks' online banking sites.

Western Union plans to work with the nation’s largest banks to provide your customers with a new class of online banking payment that allows them to make same- and next-day payments that are posted and funded to you faster and are of a higher quality than other online banking payments currently available.

EMAIL BILL PRESENTMENT AND PAYMENT

While the benefits of electronic bill presentment and payment are compelling for both billers and consumers, low consumer adoption rates have prevented billers from fully realizing the cost savings and improved customer service levels these services promote. Western Union® Payment Services aims to change this through its integration with Striata® Email bill presentment and payment (EBPP) solutions.

With this integrated, encrypted Email bill presentment and one-click payment service, consumers no longer need to register to receive their bill electronically, visit a separate website to download the bill and send a payment, or remember multiple user names and passwords. By removing these extra steps from the process, these services become dramatically easier to use for consumers.

The critical differentiator of the Western Union/Striata service is that the entire e-bill is delivered directly into the consumer's inbox as an encrypted offline attachment, enabling payment to be sent through the e-bill itself using the Western Union® Speedpay service. While complementary to existing online presentment solutions, this “push” Email billing offering can be more successful at driving adoption.

Bill Pay and Presentment Solutions for Utility Companies

Recognizing that not all customers view and pay bills in the same way, CheckFree helps you deliver a complete range of billing and payment options – from the traditional methods of receiving and paying bills by mail, in person and over the phone to complete paperless online billing and payment using either a bank or your website. CheckFree offers solutions that help you meet market demands.

Whether you need to improve a single solution or your entire offering, CheckFree can offer experience and expertise in the following payment channels:

  • By Mail. Some people still choose to receive paper bills and write checks. CheckFree can help turn these paper checks into ACH electronic debits, speeding payment collections.
  • In Person. Give your customers in-person payment convenience and choice to use cash, checks, money orders or merchant-issued certificates.
  • By Phone. Enable your customers to pay a bill anywhere they have access to a phone, all day, every day. With the recent acquisition of CheckFree by Fiserv, you can look for Fiserv’s industry-leading BillMatrix platform to be integrated into our suite of offerings.
  • Online. Deliver bill paying ease and convenience through CheckFree’s full range of electronic billing and payment (EBP) solutions at your site and beyond your site.
  • Emergency Payments. Offer a fee-based option for last-minute online payments and eliminate expenses due to delinquent payments.
  • Electronic Remittance. Provide quicker access to payment funds while reducing the cost of processing paper checks.

CUSTOMER INTERACTION OPTIMIZATION

CheckFree solutions enable you to optimize each customer interaction by offering multiple payment channel options that focus on security, reliability, functionality and convenience. Each interaction with the consumer represents an ideal opportunity to enhance the customer experience and build loyal customers.

Our Customer Interaction Optimization solutions make interactions a win/win for both you and your customers. You deliver the payment channels they seek while maintaining the ability to guide them to the most profitable channel for your organization. The ultimate business objective is to steer customers to the lower cost-to-serve billing and payment option: the online channel.

CheckFree understands your company's strategic need to direct consumers to the optimal online channel to enhance revenue growth through reductions in operating costs. Having invested substantially in consumer behavior, segmentation and marketing research, CheckFree can assist with creating marketing campaigns focused on promoting your online channel. Every bill received, payment made or visit to your website can be used to strategically drive adoption of online bill pay, e-bills and paper shut-off.

For more than 25 years, CheckFree has been a leading provider of electronic billing and payment services. We process more than one billion electronic payments each year. With CheckFree’s Customer Interaction Optimization solutions, you can enhance your payment offerings while improving your bottom line.

About Alcatel-Lucent

Alcatel-Lucent’s vision is to enrich people’s lives by transforming the way the world communicates. Alcatel-Lucent provides solutions that enable service providers, enterprises and governments worldwide to deliver voice, data and video communication services to end users. As a leader in carrier and enterprise IP technologies; fixed, mobile and converged broadband access; applications and services, Alcatel-Lucent offers the end-to-end solutions that enable compelling communications services for people at work, at home and on the move.

With 77,000 employees and operations in more than 130 countries, Alcatel-Lucent is a local partner with global reach. The company has the most experienced global services team in the industry and includes Bell Labs, one of the largest research, technology and innovation organizations focused on communications. Alcatel-Lucent achieved adjusted revenues of €17.8 billion in 2007, and is incorporated in France, with executive offices located in Paris.

YOUR ENERGY AND UTILITY PARTNER

Alcatel-Lucent offers comprehensive capabilities that combine carrier-grade communications technology and expertise with utility industry-specific knowledge. Alcatel-Lucent's IP transformation expertise and utility market-specific knowledge have led to the development of turnkey communications solutions designed for the energy and utility market. Alcatel-Lucent has extensive experience in:

  • Transforming and renewing network technologies;
  • Designing and implementing SmartGrid initiatives;
  • Meeting NERC CIP compliance and security requirements;
  • Working in live power generation, transmission and distribution environments;
  • Implementing and managing complex mission-critical communications projects;
  • Developing best-in-class partnerships with organizations like CURRENT Communications, Ambient, BelAir Networks, Alvarion and others in the utility industry.

Working with Alcatel-Lucent enables energy and utility companies to realize the increased reliability and greater efficiency of next-generation communications technology, providing a platform for – and minimizing the risks associated with – moving to SmartGrid solutions. And Alcatel-Lucent helps energy and utility companies achieve compliance with regulatory requirements and reduce operational expenses while maintaining the security, integrity and high availability of their power infrastructure and services.

ALCATEL-LUCENT IP MPLS SOLUTION FOR THE NEXT-GENERATION UTILITY NETWORK

Utility companies are experienced at building and operating reliable, effective networks that deliver essential information and maintain flawless service. The Alcatel-Lucent IP/MPLS solution can enable utility operators to extend and enhance their networks with new technologies like IP, Ethernet and MPLS. These technologies enable the utility to optimize its network to reduce both capital expenditures and operating expenses without jeopardizing reliability. Advanced technologies also allow the introduction of new applications that can improve operational and workflow efficiency within the utility. Alcatel-Lucent leverages cutting-edge technologies along with the company's broad and deep experience in the utility industry to help utility operators build better, next-generation networks with IP/MPLS.

THE ALCATEL-LUCENT ADVANTAGE

Alcatel-Lucent has years of experience in the development of IP, MPLS and Ethernet technologies. The Alcatel-Lucent IP/MPLS solution offers utility operators the flexibility, scale and feature sets required for mission-critical operation. With the broadest portfolio of products and services in the telecommunications industry, Alcatel-Lucent has the unparalleled ability to design and deliver end-to-end solutions that drive next-generation communications networks.

Delivering the Tools for Creating the Next-Generation Electrical SmartGrid

PowerSense delivers cutting-edge monitoring and control equipment together with integrated supervision to enable the modern electrical utility to prepare its existing power infrastructure for tomorrow’s SmartGrid.

PowerSense uses world-leading technology to merge existing and new power infrastructures into the electrical utility's existing SCADA and IT systems. This integration of the upgraded power infrastructure and existing IT systems instantly optimizes outage and fault management, thereby decreasing customer minutes lost (measured by the System Average Interruption Duration Index, or SAIDI).

At the same time, this integration helps the electrical utility further improve asset management (resulting in major cost savings) and power management (resulting in high-performance outage management and high power efficiency). The PowerSense product line is called DISCOS® (an Integrated Supervision and Control System for distribution networks).

DISCOS®

The following outlines the business and system values offered by the DISCOS® product line.

Business Values

  • Cutting-edge optical technology (the sensor)
  • Easily and safely retrofitted (sensors can be fitted into all transformer types)
  • End-to-end solutions (from sensors to laptop)
  • Installation in steps (implementation based on cost-benefit analysis)

System Values

  • Current (for each phase)
  • Voltage (for each phase)
  • Frequency
  • Power active, reactive and direction
  • Distance-to-fault measurement
  • Control of breakers and service relays
  • Analog inputs
  • Measurement of harmonic content for I and V
  • Measurement of earth fault

These parameters are available for both medium- and low-voltage power lines.

OPTICAL SENSOR TECHNOLOGY

With its stability and linearity, PowerSense’s cutting-edge sensor technology is setting new standards for current measurements in general. For PowerSense’s primary business area of MV grid monitoring in particular, it is creating a completely new set of standards for how to monitor the MV power grid.

The DISCOS® Current Sensor is part of the DISCOS® Opti module. The DISCOS® Sensor monitors the magnitude and phase angle of the current on both the LV and MV sides of the transformer.

BASED ON THE FARADAY EFFECT

Today, only a few applications in measuring instruments are based on the Faraday rotation principle. For instance, the Faraday effect has been used for measuring optical rotary power, for amplitude modulation of light and for remote sensing of magnetic fields.
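
For reference, the physical relation behind such sensors is the standard Faraday rotation law; the notation below is conventional and not taken from PowerSense documentation:

```latex
% Faraday rotation: the polarization plane of light passing through a
% magneto-optic material rotates in proportion to the magnetic field
% integrated along the optical path.
\theta = V \, B \, L
```

Here θ is the rotation angle, V the Verdet constant of the sensor material, B the magnetic flux density along the light path and L the path length. Because the field B around a conductor is proportional to the current flowing in it, measuring θ optically yields a measurement of that current without any galvanic connection to the line.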

Now, thanks to advanced computing techniques, PowerSense is able to offer a low-priced optical sensor based on the Faraday effect.

THE COMPANY

PowerSense A/S was established on September 1, 2006, by DONG Energy A/S (formerly Nesa A/S) as a spin-off of the DISCOS® product line business. The purpose of the spin-off was to ensure the best future business conditions for the DISCOS® product line.

After the spin-off, BankInvest A/S, a Danish investment bank, holds 70 percent of the share capital. DONG Energy A/S continues to hold 30 percent of the share capital.

Customer Service in the Brave New World of Today’s Utilities

A NEW GENERATION OF CUSTOMER

Today's utility customers are energy-dependent, information-driven, technologically advanced, willing to change and environmentally conscious. Their grandparents prompted utilities to develop and offer levelized billing, and their parents created the need for online bill presentment and credit card payment. This new generation of customer is about to usher in a brave new world of utility customer service in which the real-time utility will conduct business 24 hours a day, seven days a week, 365 days a year, and Internet-savvy consumers will have all the capabilities of the current customer service representative. They'll be able to receive pricing signals and control their utility usage via Internet portals, as well as shop among utilities for the best price and switch providers.

Expectations of system reliability are high today. Ten years ago, when customers called to let you know their power was out, the call took 20 seconds; today, they expect you to already know that their power is out and to be able to provide additional information about the nature and duration of that outage. What's wrong? Are crews on the way? What's the estimated time of restoration (ETR)? Can you text me when it's back on? A call that includes these questions (and more) takes three times as long as that phone call 10 years ago. Thankfully, utility technology is coming of age just in time to meet the needs of evolving utility customers.

Many utilities already use automated circuit switchers to monitor lines for potential fault conditions and to react in real time to isolate faults and restore power. Automated metering systems send out “last gasp” outage notifications to outage management systems to predict the location of a problem for quicker restoration of service. Two-way communications systems send signals to smart appliances, system monitoring devices and customer messaging orbs to affect customer usage patterns. Fiber-to-the-home (FTTH) and wireless systems communicate meter usage in near real time to enable monitoring for abnormal consumption patterns. If customers have all of this data at their fingertips, what more will they expect from their utility service professionals? Advanced metering infrastructure (AMI) and two-way communications between customer and utility provider are essential to the future of these innovations. Figure 1 indicates the penetration of advanced metering by region.

A TOUCH OF ORWELL

This brave new world is not without risk. Tremendous amounts of data will be acquired and maintained. Monthly usage habits of consumers can provide incredible insight into customers' lives – imagine the knowledge that real-time data can provide. As marketers begin to understand the powerful communications channels utilities possess, partnerships will emerge to maximize their value. Privacy laws and regulations defining the proper use and misuse of data – similar to the Customer Proprietary Network Information (CPNI) legislation in the telecommunications industry – will emerge. It would therefore be wise for the utility industry to take steps to limit data use before legislative mandates are enacted that could create barriers to practical use.

EMERGING BUSINESSES CREATING VALUE FOR CUSTOMERS

Many of the technologies discussed in this paper already exist; the future will simply make their application more common. The interesting part will come in seeing how these products and services are bundled and who will provide them. Over the next 10 years, many new services (and a few new spins on old ones) will be offered to the consumer via this new infrastructure. The array of service offerings will be as broad as the capabilities created through the utility's infrastructure design. Utilities offering only one-way communication from the meter will be limited, while utilities with two-way communication riding their own fiber-optic systems will find a vast number of opportunities. Some of these services will fall within the core competency of the utility and be a natural fit in creating new revenue streams; others will require new partnerships to enable their existence. Some will span residential, commercial and industrial market segments, while others will be tailored to the residential customer only.

Energy management and consulting services will flourish during the initial period, especially in areas where time-of-use rates are incorporated in all market segments. Cable, Internet, telephone and security services will consolidate in areas where fiber-to-the-home is part of the infrastructure. Utilities' ability to provide these services may be greatly affected by their legal and regulatory structures. Where limitations are imposed on the scope and type of services, partnerships will be formed to enable cost-effective service. Figure 2 shows what utilities reported to be the most common AMI system usages in a recent Federal Energy Regulatory Commission (FERC) survey.

As shown in Figure 2, load control, demand response monitoring and notification of price changes are already part of the system capabilities. As awareness of energy efficiency develops, a new focus on conservation will give rise to a newfound interest in smart appliances. Their operational characteristics will be more sophisticated than those of their predecessors from the “cycle and save” era, and they will meet customers' demand for energy savings and environmental friendliness. This will not be limited to water heaters and heating, ventilation and air-conditioning (HVAC) units. The new initiatives will encompass refrigerators, freezers, washers, dryers and other second-order appliances, driving conservation derived from time-of-day use to a new level. And these initiatives will not be limited to electricity.

IMPACTS OF TECHNOLOGICAL CHANGE ON OTHER UTILITIES

Very few utility services will be exempt from the impact of changes in the electric industry. Natural gas and water usage, too, will be affected as the nation focuses its attention on the efficient use of resources. Natural gas time-of-use rates will emerge, along with interruptible rates for residential consumers. This may take 10 to 15 years to occur, and a declining usage trend will need to be reversed; however, the same infrastructure constraints and concerns that plague the electric industry will be recognized in the natural gas industry as well. Thus, we can expect energy providers to adopt these rates in the future to stay competitive. If electric systems are able to shift peak usage and levelize loads, the need for natural gas-fired generation will diminish. Natural gas-fired plants for system peaking would become unnecessary, and the decrease in demand would help stabilize natural gas pricing.

Water availability issues are no longer limited to the Western United States, with areas such as Atlanta now beginning to experience water shortages as well. As a result, reverse-step rates that encourage water usage are being replaced with fixed and progressive step-rate structures to encourage water conservation. Automated metering can assist in eliminating waste, identifying excessive use during curtailment periods and creating a more efficient water distribution system. As energy time-of-use rates are implemented, water and wastewater treatment plants may find efficiencies in offering time-of-use rates as well in order to shape the usage characteristics of their customers without adding increased facilities. Even if this does not occur, time-of-use shifting of electrical load will have an impact on water usage patterns and effectively change water and wastewater operational characteristics.

In a world of increasing environmental vulnerability, the ability to monitor backflow in water metering will be essential in our efforts to be environmentally safe and monitor domestic threats to the water supply. Although technology’s ability to identify such threats will not prevent their occurrence, it will help utilities evaluate events and respond in order to isolate and diminish possible future threats.

IMPLICATIONS FOR UTILITIES

The above-described technological innovations don't come without an impact on the service side of utilities. It will be difficult at best for utilities to modify legacy systems to take advantage of the benefits found in new technologies. The more robust computer systems implemented in preparation for Y2K will be capable of some modifications; however, new software offerings are being designed today to address the vast opportunities that will soon exist. Processes for data management, storage, retrieval and use will need to be developed. And a new breed of customer service representative will begin to evolve. New technologies, near real-time information available to the consumer, unique customer and appliance configurations, and partnerships and services that go beyond the core competencies of the current workforce will create a short-term gap in trained customer service professionals. Billing departments will expand as rates become more complex. And the increased flexibility of customer information systems will require extensive checks and verifications to ensure accuracy.

Figure 3 (created by Robert Pratt of Pacific Northwest National Laboratory) provides a picture of the new landscape being created by the technologies utilities are implementing and the implications they have for customers.

Utilities with completely integrated systems will be the biggest winners in the future. Network management; geographic information systems; customer information systems; work order systems; supervisory control and data acquisition (SCADA) systems; and financial systems that communicate openly will be positioned to recognize the early wins that will spark the next decade of innovation. Cost-to-serve models continue to resonate as a popular topic among utility providers, and the impact of new technology will assist in making this integral to financial success.

The processes underlying current policies and procedures were designed for the way utilities traditionally operated – which is precisely why today's utilities must take a systematic approach to re-evaluating their business processes if they're to take advantage of new technology. They'll even need to consider the cost of providing a detailed bill and mail delivery. The existence of real-time readings may bring dramatic changes in payment processing. Prepay accounts may eliminate the need to require deposits or assume risk for uncollectible accounts. Daily, weekly and semi-monthly payments may bring added cost (as may allowing customers to choose their due dates in the traditional arrears billing model); thus, utilities must consider the implications of these actions on cash flow and risk before implementing them. Advance notice of service interruption due to planned maintenance or construction can be communicated electronically over two-way automated meter reading (AMR) systems to orbs, communication panels, computers or other means. These same capabilities will dramatically change credit and collections efforts over the next 10 years. Electronic notification of past due accounts, shut-off and reconnection can all be done remotely at little cost to the utility.

IMPLICATIONS FOR CONSUMERS

Customers and commercial marketing efforts will be the driving forces for much of the innovation we'll witness in coming years. No longer are customers simply comparing utilities against each other; today, they're comparing utility customer service with their best and worst customer experiences regardless of industry. This means that customers are comparing a utility's website capabilities with Amazon.com and its service response with the Ritz-Carlton, Holiday Inn or Marriott they might frequent. Service reliability is measured against FedEx. Customer service expectations are raised with every initiative of competitive enterprises – a fact utilities will have to come to terms with if they're to succeed.

Not all customers are created equal. Technologically advanced customers will find the future exciting, while customers who view their utility as just another service provider will find it complicated and at times overwhelming. Utilities must communicate with customers at all levels to adequately prepare them for a future that's already arrived.

Achieving Decentralized Coordination In the Electric Power Industry

For the past century, the dominant business and regulatory paradigms in the electric power industry have been centralized economic and physical control. The ideas presented here and in my forthcoming book, Deregulation, Innovation, and Market Liberalization: Electricity Restructuring in a Constantly Evolving Environment (Routledge, 2008), comprise a different paradigm – decentralized economic and physical coordination – which will be achieved through contracts, transactions, price signals and integrated intertemporal wholesale and retail markets. Digital communication technologies – which are becoming ever more pervasive and affordable – are what make this decentralized coordination possible. In contrast to the “distributed control” concept often invoked by power systems engineers (in which distributed technology is used to enhance centralized control of a system), “decentralized coordination” represents a paradigm in which distributed agents themselves control part of the system, and in aggregate, their actions produce order: emergent order. [1]

Dynamic retail pricing, retail product differentiation and complementary end-use technologies provide the foundation for achieving decentralized coordination in the electric power industry. They bring timely information to consumers and enable them to participate in retail market processes; they also enable retailers to discover and satisfy the heterogeneous preferences of consumers, all of whom have private knowledge that's unavailable to firms and regulators in the absence of such market processes. Institutions that facilitate this discovery through dynamic pricing and technology are crucial for achieving decentralized coordination. Thus, retail restructuring that allows dynamic pricing and product differentiation, that doesn't stifle the adoption of digital technology, and that reduces retail entry barriers is necessary if this value-creating decentralized coordination is to happen.

This paper presents a case study – the “GridWise Olympic Peninsula Testbed Demonstration Project” – that illustrates how digital end-use technology and dynamic pricing combine to provide value to residential customers while increasing network reliability and reducing required infrastructure investments through decentralized coordination. The availability (and increasing cost-effectiveness) of digital technologies enabling consumers to monitor and control their energy use and to see transparent price signals has made existing retail rate regulation obsolete. Instead, the policy recommendation that this analysis implies is that regulators should reduce entry barriers in retail markets and allow for dynamic pricing and product differentiation, which are the keys to achieving decentralized coordination.

THE KEYS: DYNAMIC PRICING, DIGITAL TECHNOLOGY

Dynamic pricing provides price signals that reflect variations in the actual costs and benefits of providing electricity at different times of the day. Some of the more sophisticated forms of dynamic pricing harness the dramatic improvements in information technology of the past 20 years to communicate these price signals to consumers. These same technological developments also give consumers a tool for managing their energy use, in either manual or automated form. Currently, with almost all U.S. consumers (even industrial and commercial ones) paying average prices, there’s little incentive for consumers to manage their consumption and shift it away from peak hours. This inelastic demand leads to more capital investment in power plants and transmission and distribution facilities than would occur if consumers could make choices based on their preferences and in the face of dynamic pricing.

Retail price regulation stifles the economic processes that lead to both static and dynamic efficiency. Keeping retail prices fixed truncates the information flow between wholesale and retail markets, and leads to inefficiency, price spikes and price volatility. Fixed retail rates for electric power service mean that the prices individual consumers pay bear little or no relation to the marginal cost of providing power in any given hour. Moreover, because retail prices don’t fluctuate, consumers are given no incentive to change their consumption as the marginal cost of producing electricity changes. This severing of incentives leads to inefficient energy consumption in the short run and also causes inappropriate investment in generation, transmission and distribution capacity in the long run. It has also stifled the implementation of technologies that enable customers to make active consumption decisions, even though communication technologies have become ubiquitous, affordable and user-friendly.

Dynamic pricing can include time-of-use (TOU) rates, which are different prices in blocks over a day (based on expected wholesale prices), or real-time pricing (RTP) in which actual market prices are transmitted to consumers, generally in increments of an hour or less. A TOU rate typically applies predetermined prices to specific time periods by day and by season. RTP differs from TOU mainly because RTP exposes consumers to unexpected variations (positive and negative) due to demand conditions, weather and other factors. In a sense, fixed retail rates and RTP are the end points of a continuum of how much price variability the consumer sees, and different types of TOU systems are points on that continuum. Thus, RTP is but one example of dynamic pricing. Both RTP and TOU provide better price signals to customers than current regulated average prices do. They also enable companies to sell, and customers to purchase, electric power service as a differentiated product.
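To make that continuum concrete, consider how the same day's consumption is billed under each design. The following minimal sketch (Python; all prices and the load profile are hypothetical, invented purely for illustration) shows that the three rate designs differ only in how much of the underlying price variability reaches the consumer:

```python
# Hypothetical comparison of fixed, TOU and RTP billing for one day.
# All prices and the load profile are invented for illustration only.

# Hourly consumption in kWh over 24 hours (an evening-heavy household).
load = [0.8] * 7 + [1.5] * 2 + [1.0] * 7 + [2.5] * 5 + [1.2] * 3

FIXED_RATE = 0.10  # one average price for every hour ($/kWh)

def tou_rate(hour):
    """Predetermined price blocks by time of day ($/kWh)."""
    if 16 <= hour < 21:
        return 0.18        # peak block
    if 7 <= hour < 16:
        return 0.11        # shoulder block
    return 0.06            # off-peak block

# RTP: hourly prices passed through from the wholesale market
# (hypothetical values; intervals can be an hour or shorter).
rtp_prices = [0.05] * 7 + [0.09] * 2 + [0.10] * 7 + [0.22] * 5 + [0.07] * 3

fixed_bill = sum(kwh * FIXED_RATE for kwh in load)
tou_bill = sum(kwh * tou_rate(hour) for hour, kwh in enumerate(load))
rtp_bill = sum(kwh * price for kwh, price in zip(load, rtp_prices))

print(f"fixed ${fixed_bill:.2f}  TOU ${tou_bill:.2f}  RTP ${rtp_bill:.2f}")
```

Under these invented numbers, the evening-heavy load profile is expensive under TOU and RTP, which is exactly the signal that rewards shifting consumption to off-peak hours.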

TECHNOLOGY’S ROLE IN RETAIL CHOICE

Digital technologies are becoming increasingly available to reduce the cost of sending prices to people and their devices. The 2007 Galvin Electricity Initiative report “The Path to Perfect Power: New Technologies Advance Consumer Control” catalogs a variety of end-user technologies (from price-responsive appliances to wireless home automation systems) that can communicate electricity price signals to consumers, retain data on their consumption and be programmed to respond automatically to trigger prices that the consumer chooses based on his or her preferences. [2] Moreover, the two-way communication advanced metering infrastructure (AMI) that enables a retailer and consumer to have that data transparency is also proliferating (albeit slowly) and declining in price.

Dynamic pricing and the digital technology that enables communication of price information are symbiotic. Dynamic pricing in the absence of enabling technology is meaningless. Likewise, technology without economic signals to respond to is extremely limited in its ability to coordinate buyers and sellers in a way that optimizes network quality and resource use. [3] The combination of dynamic pricing and enabling technology changes the value proposition for the consumer from “I flip the switch, and the light comes on” to a more diverse and consumer-focused set of value-added services.

These diverse value-added services empower consumers, enabling them to control their electricity choices with more granularity and precision than in an environment where they think solely about the total amount of electricity they consume. Digital metering and end-user devices also decrease transaction costs between buyers and sellers, lowering barriers to exchange and to the formation of particular markets and products.

Whether they take the form of building control systems that enable the consumer to see the amount of power used by each function performed in a building or appliances that can be programmed to behave differently based on changes in the retail price of electricity, these products and services provide customers with an opportunity to make better choices with more precision than ever before. In aggregate, these choices lead to better capacity utilization and better fuel resource utilization, and provide incentives for innovation to meet customers’ needs and capture their imaginations. In this sense, technological innovation and dynamic retail electricity pricing are at the heart of decentralized coordination in the electric power network.

EVIDENCE

Led by the Pacific Northwest National Laboratory (PNNL), the Olympic Peninsula GridWise Testbed Project served as a demonstration project to test a residential network with highly distributed intelligence and market-based dynamic pricing. [4] Washington’s Olympic Peninsula is an area of great scenic beauty, with population centers concentrated on the northern edge. The peninsula’s electricity distribution network is connected to the rest of the network through a single distribution substation. While the peninsula is experiencing economic growth and associated growth in electricity demand, the natural beauty of the area and other environmental concerns served as an impetus for area residents to explore options beyond simply building generation capacity on the peninsula or adding transmission capacity.

Thus, this project tested how the combination of enabling technologies and market-based dynamic pricing affected utilization of existing capacity, deferral of capital investment and the ability of distributed demand-side and supply-side resources to create system reliability. Two questions were of primary interest:

1) What dynamic pricing contracts do consumers find attractive, and how does enabling technology affect that choice?

2) To what extent will consumers choose to automate energy use decisions?

The project – which ran from April 2006 through March 2007 – included 130 broadband-enabled households with electric heating. Each household received a programmable communicating thermostat (PCT) with a visual user interface that allowed the consumer to program the thermostat for the home – specifically to respond to price signals, if desired. Households also received water heaters equipped with a GridFriendly appliance (GFA) controller chip developed at PNNL that enables the water heater to receive price signals and be programmed to respond automatically to those price signals. Consumers could control the sensitivity of the water heater through the PCT settings.
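The published project materials describe the behavior of these devices rather than their code, but the trigger-price logic can be sketched as follows. This is a hypothetical illustration, not PNNL's implementation; all names and numbers are invented:

```python
# Sketch of price-triggered device behavior of the kind the PCT and the
# GridFriendly controller enable. Names and numbers are hypothetical.

def heating_setpoint(base_f, price, trigger, max_setback_f=4.0):
    """Set back the heating setpoint as price rises above the
    consumer-chosen trigger, up to a fixed comfort limit."""
    if price <= trigger:
        return base_f
    overshoot = min((price - trigger) / trigger, 1.0)
    return base_f - max_setback_f * overshoot

def water_heater_enabled(price, trigger):
    """GFA-style rule: defer heating while price exceeds the trigger."""
    return price <= trigger

# At $0.24/kWh against a $0.12 trigger, heat backs off and the tank waits.
print(heating_setpoint(70.0, price=0.24, trigger=0.12))   # 66.0
print(water_heater_enabled(price=0.24, trigger=0.12))     # False
```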

These households also participated in a market field experiment involving dynamic pricing. While they continued to purchase energy from their local utility at a fixed, discounted price, they also received a cash account with a predetermined balance, which was replenished quarterly. The energy use decisions they made would determine their overall bill, which was deducted from their cash account, and they were able to keep any difference as profit. The worst a household could do was a zero balance, so they were no worse off than if they had not participated in the experiment. At any time customers could log in to a secure website to see their current balances and determine the effectiveness of their energy use strategies.

On signing up for the project, the households received extensive information and education about the technologies available to them and the kinds of energy use strategies facilitated by these technologies. They were then asked to choose a retail pricing contract from three options: a fixed price contract (with an embedded price risk premium), a TOU contract with a variable critical peak price (CPP) component that could be called in periods of tight capacity or an RTP contract that would reflect a wholesale market-clearing price in five-minute intervals. The RTP was determined using a uniform price double auction in which buyers (households and commercial) submit bids and sellers submit offers simultaneously. This project represented the first instance in which a double auction retail market design was tested in electric power.
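For readers unfamiliar with the mechanism, the sketch below shows how a uniform price double auction can clear: the highest-value bids are matched against the lowest-priced offers, and every cleared unit settles at a single price. It illustrates the general market design only, not the project's actual clearing engine:

```python
# Sketch of uniform-price double auction clearing, the market design the
# RTP group traded under. Illustrative only, not the project's engine.

def clear_double_auction(bids, offers):
    """bids/offers are (price, quantity) pairs. Match the highest bids
    against the lowest offers while bid >= offer; every cleared unit
    settles at one uniform price (here, the midpoint of the marginal pair)."""
    bids = sorted(bids, key=lambda b: -b[0])      # demand, high to low
    offers = sorted(offers, key=lambda o: o[0])   # supply, low to high
    i = j = 0
    taken_b = taken_o = 0.0
    cleared = 0.0
    marginal = None
    while i < len(bids) and j < len(offers):
        bid_price, bid_qty = bids[i]
        offer_price, offer_qty = offers[j]
        if bid_price < offer_price:
            break                                 # no more profitable trades
        traded = min(bid_qty - taken_b, offer_qty - taken_o)
        cleared += traded
        marginal = (bid_price, offer_price)
        taken_b += traded
        taken_o += traded
        if taken_b >= bid_qty:
            i, taken_b = i + 1, 0.0
        if taken_o >= offer_qty:
            j, taken_o = j + 1, 0.0
    if marginal is None:
        return None, 0.0
    return (marginal[0] + marginal[1]) / 2, cleared

# Two buyers and two sellers (price $/kWh, quantity kWh):
price, qty = clear_double_auction(
    bids=[(0.20, 5), (0.10, 5)], offers=[(0.05, 4), (0.12, 8)])
print(f"clearing price ${price:.2f}, cleared quantity {qty} kWh")
```

In a five-minute RTP market, a clearing computation of this kind would run each interval, with household devices effectively submitting bids derived from their trigger-price settings.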

The households ranked the contracts and were then divided fairly evenly among the three types, along with a control group that received the enabling technologies and had their energy use monitored but did not participate in the dynamic pricing market experiment. All households received either their first or second choice; interestingly, more than two-thirds of the households ranked RTP as their first choice. This result counters the received wisdom that residential customers want only reliable service at low, stable prices.

According to the 2007 report on the project by D.J. Hammerstrom (and others), on average participants saved 10 percent on their electricity bills. [5] That report also includes the following findings about the project:

Result 1. For the RTP group, peak consumption decreased by 15 to 17 percent relative to what the peak would have been in the absence of the dynamic pricing – even though their overall energy consumption increased by approximately 4 percent. This flattening of the load duration curve indicates shifting some peak demand to nonpeak hours. Such shifting increases the system’s load factor, improving capacity utilization and reducing the need to invest in additional capacity, for a given level of demand. A 15 to 17 percent reduction is substantial and is similar in magnitude to the reductions seen in other dynamic pricing pilots.

After controlling for price response, weather effects and weekend days, the RTP group’s overall energy consumption was 4 percent higher than that of the fixed price group. This result, in combination with the load duration effect noted above, indicates that the overall effect of RTP dynamic pricing is to smooth consumption over time, not decrease it.

Result 2. The TOU group achieved both a large price elasticity of demand (-0.17), based on hourly data, and an overall energy reduction of approximately 20 percent relative to the fixed price group.

After controlling for price response, weather effects and weekend days, the TOU group’s overall energy consumption was 20 percent lower than that of the fixed price group. This result indicates that the TOU (with occasional critical peaks) pricing induced overall conservation – a result consistent with the results of the California SPP project. The estimated price elasticity of demand in the TOU group was -0.17, which is high relative to that observed in other projects. This elasticity suggests that the pricing coupled with the enabling end-use technology amplifies the price responsiveness of even small residential consumers.
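As context for how a figure like -0.17 is obtained: price elasticity is conventionally estimated as the slope of a regression of log quantity on log price. A minimal sketch with hypothetical hourly data (not the project's) shows the mechanics:

```python
# Sketch: estimating a price elasticity of demand from hourly data via a
# log-log regression; the slope is the elasticity. Data are hypothetical,
# constructed so the estimate comes out near -0.17.
import numpy as np

prices = np.array([0.06, 0.08, 0.10, 0.14, 0.20, 0.28])       # $/kWh
quantities = np.array([2.10, 2.00, 1.93, 1.82, 1.71, 1.61])   # kWh per hour

# ln(Q) = a + e * ln(P): fit a line in logs and read off the slope e.
elasticity, _ = np.polyfit(np.log(prices), np.log(quantities), 1)
print(f"estimated elasticity: {elasticity:.2f}")   # about -0.17
```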

Despite these results, dynamic pricing and enabling technologies are proliferating slowly in the electricity industry. Proliferation requires a combination of formal and informal institutional change to overcome a variety of barriers. And while formal institutional change (primarily in the form of federal legislation) is reducing some of these barriers, it remains an incremental process. The traditional rate structure, fixed by state regulation and slow to change, presents a substantial barrier. Predetermined load profiles inhibit market-based pricing by ignoring individual customer variation and the information that customers can communicate through choices in response to price signals. Furthermore, the persistence of standard offer service at a discounted rate (that is, a rate that does not reflect the financial cost of insurance against price risk) stifles any incentive customers might have to pursue other pricing options.

The most significant – yet also most intangible and difficult-to-overcome – obstacle to dynamic pricing and enabling technologies is inertia. All of the primary stakeholders in the industry – utilities, regulators and customers – harbor status quo bias. Incumbent utilities face incentives to maintain the regulated status quo as much as possible (given the economic, technological and demographic changes surrounding them) – and thus far, they’ve been successful in using the political process to achieve this objective.

Customer inertia also runs deep because consumers have not had to think about their consumption of electricity or the price they pay for it – a bias consumer advocates generally reinforce by arguing that low, stable prices for highly reliable power are an entitlement. Regulators and customers value the stability and predictability that have arisen from this vertically integrated, historically supply-oriented and reliability-focused environment; however, what is unseen and unaccounted for is the opportunity cost of such predictability – the foregone value creation in innovative services, empowerment of customers to manage their own energy use and use of double-sided markets to enhance market efficiency and network reliability. Compare this unseen potential with the value creation in telecommunications, where even young adults can understand and adapt to cell phone pricing plans and benefit from the stream of innovations in the industry.

CONCLUSION

The potential for a highly distributed, decentralized network of devices automated to respond to price signals creates new policy and research questions. Will individuals automate their devices' responses to price signals? If so, how will they adjust the settings over time? Does the combination of price effects and innovation increase total surplus, including consumer surplus? In aggregate, do these distributed actions create emergent order in the form of system reliability?

Answering these questions requires thinking about the diffuse and private nature of the knowledge embedded in the network, and the extent to which such a network becomes a complex adaptive system. Technology helps determine whether decentralized coordination and emergent order are possible; the dramatic transformation of digital technology in the past few decades has decreased transaction costs and increased the extent of feasible decentralized coordination in this industry. Institutions – which structure and shape the contexts in which such processes occur – provide a means for creating this coordination. And finally, regulatory institutions affect whether or not this coordination can occur.

For this reason, effective regulation should focus not on allocation but rather on decentralized coordination and how to bring it about. This in turn means a focus on market processes, which are adaptive institutions that evolve along with technological change. Regulatory institutions should also be adaptive, and policymakers should view regulatory policy as work in progress so that the institutions can adapt to unknown and changing conditions and enable decentralized coordination.

ENDNOTES

1. Order can take many forms in a complex system like electricity – for example, keeping the lights on (short-term reliability), achieving economic efficiency, optimizing transmission congestion, longer-term resource adequacy and so on.

2. Roger W. Gale, Jean-Louis Poirier, Lynne Kiesling and David Bodde, “The Path to Perfect Power: New Technologies Advance Consumer Control,” Galvin Electricity Initiative report (2007). www.galvinpower.org/resources/galvin.php?id=88

3. The exception to this claim is the TOU contract, where the rate structure is known in advance. However, even on such a simple dynamic pricing contract, devices that allow customers to see their consumption and expenditure in real time instead of waiting for their bill can change behavior.

4. D.J. Hammerstrom et al., “Pacific Northwest GridWise Testbed Demonstration Projects, Volume I: The Olympic Peninsula Project” (2007). http://gridwise.pnl.gov/docs/op_project_final_report_pnnl17167.pdf

5. Ibid.

How Intelligent Is Your Grid?

Many people in the utility industry see the intelligent grid — an electric transmission and distribution network that uses information technology to predict and adjust to network changes — as a long-term goal that utilities are still far from achieving. Energy Insights research, however, indicates that today’s grid is more intelligent than people think. In fact, utilities can begin building the network of the future today by better leveraging their existing resources and focusing on the intelligent-grid backbone.

DRIVERS FOR THE INTELLIGENT GRID

Before discussing the intelligent grid backbone, it’s important to understand the drivers directing the intelligent grid’s progress. While many groups — such as government, utilities and technology companies — may be pushing the intelligent grid forward, they are also slowing it down. Here’s how:

  • Government. With the 2005 U.S. Energy Policy Act and the more recent 2007 Energy Independence and Security Act, the federal government has acknowledged the intelligent grid’s importance and is supporting investment in the area. Furthermore, public utility commissions (PUCs) have begun supporting intelligent grid investments like smart metering. At the same time, however, PUCs have a duty to maintain reasonable prices. Since utilities have not extensively tested the benefits of some intelligent grid technologies, such as distribution line sensors, many regulators hesitate to support utilities investing in intelligent grid technologies beyond smart metering.
  • Utilities. Energy Insights research indicates that information technology, in general, enables utilities to increase operational efficiency and reduce costs. For this reason, utilities are open to information technology; however, they’re often looking for quick cost recovery and benefits. Many intelligent grid technologies provide longer-term benefits, making them difficult to cost-justify over the short term. Since utilities are risk-averse, intelligent grid investments can look riskier to them than traditional information technology investments.
  • Technology. Although advanced enough to function on the grid today, many intelligent grid technologies could become quickly outdated thanks to the rapidly developing marketplace. What’s more, the life span of many intelligent grid technologies is not as long as those of traditional grid assets. For example, a smart meter’s typical life span is about 10 to 15 years, compared with 20 to 30 years for an electro-mechanical meter.

With strong drivers and competing pressures like these, it’s not a question of whether the intelligent grid will happen but when utilities will implement new technologies. Given the challenges facing the intelligent grid, the transition will likely be more of an evolution than a revolution. As a result, utilities are making their grids more intelligent today by focusing on the basics, or the intelligent grid backbone.

THE INTELLIGENT GRID BACKBONE

What comprises this backbone? Answering this question requires a closer look at how intelligence changes the grid. Typically, a utility has good visibility into the operation of its generation and transmission infrastructure but poor visibility into its distribution network. As a result, the utility must respond to a changing distribution network based on very limited information. Furthermore, if a grid event requires attention — such as in the case of a transformer failure — people must review information, decide to act and then manually dispatch field crews. This type of approach translates to slower, less informed reactions to grid events.

The intelligent grid changes these reactions through a backbone of technologies — sensors, communication networks and advanced analytics — especially developed for distribution networks. To better understand these changes, we can imagine a scenario where a utility has an outage on its distribution network. As shown in Figure 1, additional grid sensors collect more information, making it easier to detect problems. Communications networks then allow sensors to convey the problem to the utility. Advanced analytics can efficiently process this information, determine more precisely where the fault is located, and automatically respond to the problem and dispatch field crews. These components not only enable faster, better-informed reactions to grid problems; they can also support real-time pricing, improve demand response and better integrate distributed and renewable energy sources.
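A minimal sketch of that sensor-to-analytics-to-dispatch flow may make the division of labor clearer; all class names, fields and thresholds here are hypothetical:

```python
# Sketch of the sensor -> communications -> analytics -> dispatch flow.
# All class names, fields and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    feeder: str
    current_amps: float

def detect_faults(readings, threshold_amps=0.0):
    """Analytics step: flag sensors reporting a loss of current,
    which localizes the fault to the section just upstream of them."""
    return [r for r in readings if r.current_amps <= threshold_amps]

def dispatch(faults):
    """Response step: in practice this would open a work order and
    route a field crew; here it just prints the action."""
    for r in faults:
        print(f"Dispatch crew to {r.feeder}, section at sensor {r.sensor_id}")

# Readings delivered over the communications network:
readings = [
    Reading("s-101", "feeder-7", 120.0),
    Reading("s-102", "feeder-7", 0.0),    # downstream of the fault
    Reading("s-201", "feeder-9", 95.0),
]
dispatch(detect_faults(readings))
```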

A CLOSER LOOK AT BACKBONE COMPONENTS

A deeper dive into each of these intelligent grid backbone technologies reveals how utilities are gaining more intelligence about their grid today.

Network sensors are important not only for real-time operations — such as locating faults and connecting distributed energy sources to the grid — but also for providing a rich historical data source to improve asset maintenance and load research and forecasting. Today, more utilities are using sensors to better monitor their distribution networks; however, they’re focused primarily on smart meters. The reason for this is that smart meters have immediate operational benefits that make them attractive for many utilities today, including reducing meter reader costs, offering accurate billing information, providing theft control and satisfying regulatory requirements. Yet this focus on smart meters has created a monitoring gap between the transmission network and the smart meter.

A slew of sensors is available from companies such as General Electric, ABB, PowerSense, GridSense and Serveron to fill this monitoring gap. Tracking everything from load balancing and transformer status to circuit breakers and tap changers, energized downed lines, high-impedance faults and stray voltage, these sensors can close the gap; yet utilities hesitate to invest in them because they lack the immediate operational benefits of smart meters.

By monitoring this gap, however, utilities can secure longer-term grid benefits, such as a reduced need to build new generation capacity. Utilities have found they can begin monitoring this gap by:

  • Prioritizing sensor investments. Customer complaints and regulatory pressure have pushed some utilities to take action for particular parts of their service territory. For example, one utility Energy Insights studied received numerous customer complaints about a particular feeder’s reliability, so the utility invested in line sensors for that area. Another utility began considering sensor investments in troubled areas of its distribution network when regulators demanded that the utility improve its System Average Interruption Frequency Index (SAIFI) and System Average Interruption Duration Index (SAIDI) rankings from the bottom 50 percent to the top 25 percent of benchmarked utilities. By focusing on such areas, utilities can achieve “quick wins” with sensors and build confidence in deploying additional sensors on their distribution grids.
  • Realizing it’s all about compromise. Even in high-priority areas, it may not make financial sense for a utility to deploy the full range of sensors for every possible asset. In some situations, utilities may target a particular area of the service territory with a higher density of sensors. For example, a large U.S. investor-owned utility with a medium voltage-sensing program placed a high density of sensors along a specific section of its service territory. On the other hand, utilities might cover a broader area of service territory with fewer sensors, similar to the approach taken by a large investor-owned utility Energy Insights looked at that monitored only transformers across its service territory.
  • Rolling in sensors with other intelligent grid initiatives. Some utilities find ways to combine their smart metering projects with other distribution network sensors or to leverage existing investments that could support additional sensors. One utility that Energy Insights looked at installed transformer sensors along with a smart meter initiative and leveraged the communications networks it used for smart metering.

While sensors provide an important means of capturing information about the grid, communication networks are critical to moving that information throughout the intelligent grid — whether between sensors or field crews. Typically, to enable intelligent grid communications, utilities must either build new communications networks to bring intelligence to the existing grid or incorporate communication networks into new construction. Yet utilities today are also leveraging existing or recently installed communications networks to facilitate more sophisticated intelligent grid initiatives such as the following:

  • Smart metering and automated meter-reading (AMR) initiatives. With the current drive to install smart meters, many utilities are covering their distribution networks with communications infrastructure. Furthermore, existing AMR deployments may include communications networks that can bring data back to the utility. Some utilities are taking advantage of these networks to begin plugging other sensors into their distribution networks.
  • Mobile workforce. The deployment of mobile technologies for field crews is another hot area for utilities right now. Utilities are deploying cellular networks for field crew voice and data communications. Although utilities have typically been hesitant to work with third-party communications providers, they’ve become more comfortable with outside providers after using them for their mobile technologies. Since most of the cellular networks can provide data coverage as well, some utilities are beginning to use these providers to transmit sensor information across their distribution networks.

Since smart metering and mobile communications networks are already in place, the incremental cost of installing sensors on these networks is relatively low. The key is making sure that different sensors and components can plug into these networks easily (for example, using a standard communications protocol).
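As an illustration of what “plugging into these networks easily” can mean, the sketch below wraps any sensor’s reading in one shared message envelope, so that a transformer sensor and a line sensor travel over the same network in the same format. The field names are hypothetical, and a real deployment would more likely adopt an established utility protocol standard:

```python
# Sketch of a common message envelope that lets heterogeneous sensors
# share one network. Field names are hypothetical; a real deployment
# would more likely use an established standard protocol.
import json

def encode_reading(sensor_id, kind, value, unit, timestamp):
    """Serialize any sensor's reading into the shared envelope."""
    return json.dumps({
        "sensor_id": sensor_id,
        "kind": kind,        # e.g., "transformer_temp", "line_current"
        "value": value,
        "unit": unit,
        "ts": timestamp,
    })

# A transformer sensor and a line sensor use the identical format:
print(encode_reading("xfmr-42", "transformer_temp", 87.5, "C",
                     "2008-03-01T14:05:00Z"))
print(encode_reading("line-07", "line_current", 118.2, "A",
                     "2008-03-01T14:05:00Z"))
```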

The last key piece of the intelligent grid backbone is advanced analytics. Utilities are required to make quick decisions every day if they’re to maintain a safe and reliable grid, and the key to making such decisions is being well informed. Intelligent grid analytics can help utilities quickly process large amounts of data from sensors so that they can make those informed decisions. However, how quickly a decision needs to be made depends on the situation. Intelligent grid analytics assist with two types of decisions: very quick decisions (veQuids) and quick decisions (Quids). veQuids are made in milliseconds by computers and intelligent devices analyzing complex, real-time data – an intelligent grid vision that’s still a future development for most utilities.

Fortunately, many proactive decisions about the grid don’t have to be made in milliseconds. Many utilities today can make Quids — often manual decisions — to predict and adjust to network changes within a time frame of minutes, days or even months.

No matter how quick the decision, however, all predictive efforts are based on access to good-quality data. In putting their Quid capabilities to use today — in particular for predictive maintenance and smart metering — utilities are building not only intelligence about their grids but also a foundation for providing more advanced veQuids analytics in the future through the following:

  • The information foundation. Smart metering and predictive maintenance require utilities to collect not only more data but also more real-time data. Smart metering also helps break down barriers between retail and operational data sources, which in turn creates better visibility across many data sources to provide a better understanding of a complex grid.
  • The automation transition. Making the leap from Quids to veQuids requires more than just better access to more information — it also requires automation. While fully automated decision-making is still a thing of the future, many utilities are taking steps to compile and display data automatically as well as do some basic analysis, using dashboards from providers such as OSIsoft and Obvient Strategies to display high-level information customized for individual users. The user then further analyzes the data, makes decisions and takes action based on that analysis. Many utilities today use the dashboard model to monitor critical assets based on both real-time and historical data, as sketched below.
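A minimal sketch of that dashboard pattern, with hypothetical assets and limits, shows how Quid-style monitoring compiles readings, flags exceptions and leaves the decision to a person:

```python
# Sketch of dashboard-style "Quid" monitoring: compile readings, flag
# assets outside their limits, leave the decision to the operator.
# Asset names and limits are hypothetical.

LIMITS = {"oil_temp_c": 95.0, "load_pct": 90.0}

def review(assets):
    """Return human-readable alerts for any metric over its limit."""
    alerts = []
    for asset, metrics in assets.items():
        for metric, value in metrics.items():
            limit = LIMITS.get(metric)
            if limit is not None and value > limit:
                alerts.append(f"{asset}: {metric} = {value} (limit {limit})")
    return alerts

assets = {
    "transformer-12": {"oil_temp_c": 97.2, "load_pct": 83.0},
    "transformer-17": {"oil_temp_c": 71.0, "load_pct": 64.0},
}
for alert in review(assets):
    print(alert)   # the operator reviews and decides whether to act
```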

ENSURING A MORE INTELLIGENT GRID TODAY AND TOMORROW

As these backbone components show, utilities already have some intelligence on their grids. Now, they’re building on that intelligence by leveraging existing infrastructure and resources — whether it’s voice communications providers for data transmission or Quid resources to build a foundation for the veQuids of tomorrow. In particular, utilities need to look at:

  • Scalability. Utilities need to make sure that whatever technologies they put on the grid today can grow to accommodate larger portions of the grid in the future.
  • Flexibility. Given rapid technology changes in the marketplace, utilities need to make sure their technology is flexible and adaptable. For example, utilities should consider smart meters that have the ability to change out communications cards to allow for new technologies.
  • Integration. Due to the evolutionary nature of the grid, and with so many intelligent grid components that must work together (intelligent sensors at substations, transformers and power lines; smart meters; and distributed and renewable energy sources), utilities need to make sure these disparate components can work with one another. Utilities need to consider how to introduce more flexibility into their intelligent grids to accommodate the increasingly complex network of devices.

As today’s utilities employ targeted efforts to build intelligence about the grid, they must keep in mind that whatever action they take today – no matter how small – must ultimately help them meet the demands of tomorrow.

Making Change Work: Why Utilities Need Change Management

Organizations are often reluctant to engage change management programs, plans and teams. When they do, the programs are frequently launched too late in the project process, only moderately funded or absorbed within the project team as part-time responsibilities – all of which we’ve seen happen time and again in the utility industry.

“Making Change Work,” an IBM study done in collaboration with the Center of Evaluation and Methods at Bonn University, analyzed the factors behind successful implementation of change. The scope of this study, released in 2007, is now being expanded because the project management and change management professions, formerly aligned, are at a turning point of differentiation. The reason is simple: too many projects fail to consider both components as critical to success – and therefore lack insight into the day-to-day impact of a change on members of the organization.

WHAT IS CHANGE MANAGEMENT?

Change management is a structured approach to business transformation that manages the transition from a current state to a desired future state. Far from being static or rigid, change management is an ever-evolving program that varies with the needs of the organization. Effective change management involves people and provides open communication.

Change management is as important as project management. However, whereas project management is a tactical activity, change management represents a strategic initiative. To understand the difference, consider the following:

  • Change management is the process of driving corporate strategy by identifying, addressing and managing barriers to change across the organization or enterprise.
  • Project management is the process of implementing the tools needed to enable or mobilize the corporate strategy.

Change management is an ongoing process that works in close concert with project management. At any given time at least one phase of change management should be occurring. More likely, multiple phases will be taking place across various initiatives.

A change management program can be tailored to manage the needs of the organizational culture and relationships. The program must close the gaps among workforce, project team and sponsor leadership during all phases of all projects. It does this by:

  • Ensuring proper alignment of the organization with new technology and process requirements;
  • Preparing people for new processes and technology through training and communication;
  • Identifying and addressing human resource implications such as job definitions, union negotiations and performance measures;
  • Managing the reaction of both individuals and the entire organization to change; and
  • Providing the right level of support for ongoing implementation success.

The three fundamental activities of a change management program are leading, communicating and engaging. These three activities should span the project life cycle to maintain both awareness of the change and its momentum (Figure 1).

KEY ELEMENTS OF A CHANGE PROGRAM

There are three best practice elements that make the difference between successful projects and less successful projects: [1]

Organizational awareness of the challenges inherent in any change. This involves the following:

  • Getting a real understanding of – and leadership buy-in to – the stakeholders and culture;
  • Recognizing the interdependence of strategy and execution;
  • Ensuring an integrated strategy approach linking business strategy, operations, organization design and change, and technology strategy; and
  • Educating leadership on change requirements and commitment.

Consistent use of formal methods for change management. This should include:

  • Covering the complete life cycle – from definition to deployment to post-implementation optimization;
  • Allowing for easy customization and flexibility through a modular design;
  • Incorporating change management and value realization components into each phase to increase the likelihood of success; and
  • Providing a published plan with ongoing accountability and sponsorship as well as continuous improvement.

A specified share of the project budget invested in change management. This should involve:

  • Investing in change in a way that is linked to project success. Projects that invest more than 10 percent of the project budget in change management have an average success rate of 45 percent (Figure 2). [2]
  • Assigning the right resources to support change management early on and maintaining the required support. This also limits the adverse impacts of change on an organization’s productivity (Figure 3). [3]

WHY DO UTILITIES NEED CHANGE MANAGEMENT?

Utilities today face a unique set of challenges. For starters, they’re simultaneously dealing with aging infrastructures and aging workforces. In addition, there are market pressures to improve performance, become more “green” and mitigate rising energy costs. To address these realities, many utilities are pursuing merger and acquisition (M&A) opportunities as well as implementing new technologies.

The cost cutting of the past decade combined with M&As has left utilities with gaps in workforce experience as well as budget challenges. Yet utilities are facing major business disruptions going into the next decade and beyond. To cope with these disruptions, companies are implementing new technologies such as the intelligent grid, advanced metering infrastructure (AMI), meter data management (MDM), enterprise asset management (EAM) and work management systems (WMSs). It’s not uncommon for utilities to be implementing multiple new systems simultaneously that affect the day-to-day activities of people throughout the organization, from frontline workers to senior managers.

A change management program can address a number of challenges specific to the utilities industry.

CULTURAL CLIMATE: ‘BUT WE’RE DIFFERENT’

A utility is a utility is a utility. But a deeper look into individual businesses reveals nuances in their relationships with both internal and external stakeholders that are unique to each company. A change management team must intimately understand these relationships. For example, externally, how is the utility perceived by regulators, customers, the community and even analysts? As for internal relationships, how do the various operating divisions relate and work together? Some operating divisions work well together on project teams and respect each other and their differences; others do not.

There may be cultural differences, but work is work, and only change management can address these relationships. Knowing the utility’s cultural climate and relationships will help shape each phase of the change management program and allow change management professionals to customize a project or system implementation to fit the company’s culture.

REGULATORY LANDSCAPE

With M&As and increasing market pressures across the United States, the regulatory landscape confronting utilities is becoming more variable. We’ve seen several types of regulatory-related challenges.

Regulatory pressure. Whether regulators mandate or simply encourage new technology implementations can make a significant difference in how stakeholders in a project behave. In general, there’s more resistance to a new technology when it’s required versus voluntarily implemented. Change management can help work through participant behaviors and mitigate obstacles so that project work can continue as planned.

Multiple regulatory jurisdictions. Many utilities with recently expanded footprints following M&As now have to manage requests from and expectations of multiple regulatory commissions. Often these commissions have different mandates. Change management initiatives are needed to work through the complexity of expectations, manage multiple regulatory relationships and drive utilities toward a unified corporate strategy.

Regulatory evolution. Just as markets evolve, so do regulatory influences and mandates. Often regulators will issue orders that can be interpreted in many ways. They may even do this to get information in the form of reactions from their various constituents. Whatever the reason, the reality is that utilities are managing an ever-changing portfolio of regulations. Change management can better prepare utilities for this constant change.

OPERATIONS MATURITY

When new systems and technologies being implemented encompass multiple operating divisions, it can be difficult for stakeholders to agree on operating standards or processes. Project team members representing the various operating regions can resist compromise for fear of losing control. This often occurs when utilities are attempting to integrate systems across operating regions following an acquisition.

Change management helps ensure that the various constituents – for example, the regional operating divisions – are prepared for imminent business transformation. In large organizations, this preparation period can take a year or more. But to realize the benefits of new systems and technology implementations, organizations must be ready to receive them. Readiness and preparedness are largely the responsibility of the change management team.

ORGANIZATIONAL COHESIVENESS

The notion of organizational cohesiveness is that all constituents across the organization are equally committed to the business transformation initiative and share the same understanding of the overarching corporate strategy while also performing their individual roles and responsibilities.

Senior executives must align their visions and share a common commitment to change. After all, they set the tone for change throughout their respective organizations. If they are not in sync with each other, their organizations become silos, and business processes are less likely to be fluid across organizational boundaries. Frontline managers and associates must, in turn, be engaged and enthusiastic about the transformations to come.

Organizational cohesiveness is especially critical during large systems implementations involving utility field operations. Leaders at multiple locations must be ready to communicate and support change – and this support must be visible to the workforce. Utilities must understand this requirement at the beginning of a project to make change manageable, realistic and personal enough to sustain momentum. All too often, we’ve heard team members comment, “We had a lot of leadership at the project kickoff, but we really haven’t seen leadership at any of our activities or work locations since then. The project team tells us what to do.”

Moreover, leadership – when removed from the project – usually will not admit that they’re in the dark about what’s going on. Yet their lack of involvement will not escape the attention of frontline employees. Once the supervisor is perceived as lacking information – and therefore power – it’s all over. Improving customer service and quality, cutting costs, adopting new technology and merging operations all require changing employees. [4]

For utilities, the concept of organizational cohesiveness is especially important because just as much technology “lives” outside IT as inside. Yet the engineers who use this non-IT-controlled technology – what Gartner calls “operations technology” – are usually disconnected from the IT world in terms of both practical planning and execution. However, these worlds must act as one for a company to be truly agile. [5]

Change management methods and tools ensure that organizational cohesiveness exists through project implementation and beyond.

UNION ENGAGEMENT

Successful change depends on a sustained partnership with union representatives throughout the project life cycle. Project leadership and union leadership must partner to implement change. Union representation should be on the project team. Representatives can be involved in process reviews, testing and training, or asked to serve as change champions. In addition, communication is critical throughout all phases of a project. Frontline employees must see real evidence of how a change will benefit them. Change is personal: everyone wants to know how his or her job will be impacted.

There should also be union representation in training activities, since workers tend to be more receptive to peer-to-peer support. Utilities should, for example, engage union change champions to help co-workers during training and to be site “go to” representatives. Utilities should also provide advance training and recognize all who participate in it.

Union representatives should also participate in design and/or testing, since they will be able to pinpoint issues that will impact routine daily tasks. It could be something as simple as changing screen labels per their recommendation to increase user understanding.

More than one union workforce may be involved in a project. Location cultures that exist in large service territories or that have resulted from mergers may try to isolate themselves from the project team and resist change. Utilities should assemble a team from various work groups and then do the following to address the history and differences in the workforce:

  • Request ongoing union participation throughout the life of the project.
  • Include union roles as part of the project charter and define these roles with union leadership.
  • Provide a kickoff overview to union leadership.
  • Include union representation in work process development with balanced representation from various areas. Union employees know the job and can quickly identify the pros and cons of work tasks. Structured facilitation and issue-resolution processes are required.
  • Assign a corporate human resource or labor relations role to review processes that impact the union workforce.
  • Develop communication campaigns that address union concerns, such as conducting face-to-face presentations at employees’ work locations and educating union leaders prior to each change rollout.
  • Involve union representatives in training and user support.

Change management is necessary to sort through the relationships of multiple union workforces so that projects and systems can be implemented.

AN AGING WORKFORCE

A successful change management program will help mitigate the aging workforce challenges utilities will be facing for many years to come. [6]

WHAT TO EXPECT FROM A SUCCESSFUL CHANGE MANAGEMENT PROGRAM

The result of a successful change management program is a flexible organization that’s responsive to customer needs, regulatory mandates and market pressures, and readily embraces new technologies and systems. A change-ready organization anticipates, expects and is increasingly comfortable with change and exhibits the following characteristics:

  • The organization is aligned.
  • The leaders are committed.
  • Business processes are developed and defined across all operational units.
  • Associates at all levels have received communications and have continued access to resources.

Facing major business transformations and unique industry challenges, utilities cannot afford not to engage change management programs. This skill set is as critical as any other in the organization. Change is a cost, and it should be part of the project budget.

Change is an ongoing, long-term investment. Good change management designed specifically for your culture and challenges minimizes change’s adverse effect on daily productivity and helps you reach and sustain project goals.

ENDNOTES

  1. “Making Change Work” (an IBM study), Center of Evaluation and Methods, Bonn University, 2007; excerpts from “IBM Integrated Strategy and Change Methodology,” 2007.
  2. “Making Change Work,” Center of Evaluation and Methods, Bonn University, 2007.
  3. Ibid.
  4. T.J. Larkin and Sandar Larkin, “Communicating Change: Winning Employee Support for New Business Goals,” McGraw-Hill, 1994, p. 31.
  5. K. Steenstrup, B. Williams, Z. Sumic, C. Moore; “Gartner’s Energy and Utilities Summit: Agility on Both Sides of the Divide”; Gartner Industry Research ID Number G00145388; Jan. 30, 2007; p. 2.
  6. P. R. Bruffy and J. Juliano, “Addressing the Aging Utility Workforce Challenge: ACT NOW,” Montgomery Research 2006 journal.