Alcatel-Lucent Your Smart Grid Partner

Alcatel-Lucent offers comprehensive capabilities that combine Utility industry-specific knowledge and experience with carrier-grade communications technology and expertise. Our IP/MPLS Transformation capabilities and Utility market-specific knowledge are the foundation of turnkey solutions designed to enable Smart Grid and Smart Metering initiatives. In addition, Alcatel-Lucent has specifically developed Smart Grid and Smart Metering applications and solutions that:

  • Improve the availability, reliability and resiliency of critical voice and data communications even during outages
  • Enable optimal use of network and grid devices by setting priorities for communications traffic according to business requirements
  • Meet NERC CIP compliance and cybersecurity requirements
  • Improve the physical security and access control mechanisms for substations, generation facilities and other critical sites
  • Offer a flexible and scalable network to grow with the demands and bandwidth requirements of new network service applications
  • Provide secure web access for customers to view account, electricity usage and billing information
  • Improve customer service and experience by integrating billing and account information with IP-based, multi-channel client service platforms
  • Reduce carbon emissions and increase efficiency by lowering communications infrastructure power consumption by as much as 58 percent

Working with Alcatel-Lucent enables Energy and Utility companies to realize the increased reliability and greater efficiency of next-generation communications technology, providing a platform for, and minimizing the risks associated with, moving to Smart Grid solutions. And Alcatel-Lucent helps Energy and Utility companies achieve compliance with regulatory requirements and reductions in operational expenses while maintaining the security, integrity and high availability of their power infrastructure and services. We build Smart Networks to support the Smart Grid.

American Recovery and Reinvestment Act of 2009 Support from Alcatel-Lucent

The American Recovery and Reinvestment Act (ARRA) of 2009 was adopted by Congress in February 2009 and allocates $4.5 billion to the Department of Energy (DoE) for Smart Grid deployment initiatives. As a result of the ARRA, the DoE has established a process for awarding the $4.5 billion via investment grants for Smart Grid research, development, and deployment projects. Alcatel-Lucent is uniquely qualified to help utilities take advantage of the ARRA Smart Grid funding. In addition to world-class technology and Smart Grid and Smart Metering solutions, Alcatel-Lucent offers turnkey assistance in the preparation of grant applications, and subsequent follow-up and advocacy with federal agencies. Partnership with Alcatel-Lucent on ARRA includes:

  • Design, implementation, and support for a Smart Grid network
  • Identification of all standardized and unique elements of each grant program
  • Preparation and compilation of all required grant application components, such as project narratives, budget formulation, market surveys, mapping, and all other documentation required for completion
  • Advocacy at federal, state, and local government levels to firmly establish the value proposition of a proposal and advance it through the entire process to ensure the maximum opportunity for success

Alcatel-Lucent is a Recognized Leader in the Energy and Utilities Market

Alcatel-Lucent is an active and involved leader in the Energy and Utility market, with active membership and leadership roles in key Utility industry associations, including the Utility Telecom Council (UTC), the American Public Power Association (APPA), and GridWise. GridWise is an association of Utilities, industry research organizations (e.g., EPRI and Pacific Northwest National Laboratory), and Utility vendors, working in cooperation with the DoE to promote Smart Grid policy, regulatory issues, and technologies (see www.gridwise.org for more information). Alcatel-Lucent is also represented on the Board of Directors of UTC’s Smart Network Council, which was established in 2008 to promote and develop Smart Grid policies, guidelines, and recommended technologies and strategies for Smart Grid solution implementation.

Alcatel-Lucent IP/MPLS Solution for the Next-Generation Utility Network

Utility companies are experienced at building and operating reliable and effective networks to ensure the delivery of essential information and maintain flawless service delivery. The Alcatel-Lucent IP/MPLS solution can enable the utility operator to extend and enhance its network with new technologies like IP, Ethernet and MPLS. These new technologies will enable the utility to optimize its network to reduce both CAPEX and OPEX without jeopardizing reliability. Advanced technologies also allow the introduction of new Smart Grid applications that can improve operational and workflow efficiency within the utility. Alcatel-Lucent leverages cutting edge technologies along with the company’s broad and deep experience in the utility industry to help utility operators build better, next-generation networks with IP/MPLS.

Alcatel-Lucent has years of experience in the development of IP, MPLS and Ethernet technologies. The Alcatel-Lucent IP/MPLS solution offers utility operators the flexibility, scale and feature sets required for mission-critical operation. With the broadest portfolio of products and services in the telecommunications industry, Alcatel-Lucent has the unparalleled ability to design and deliver end-to-end solutions that drive next-generation utility networks.

About Alcatel-Lucent

Alcatel-Lucent’s vision is to enrich people’s lives by transforming the way the world communicates. As a leader in utility, enterprise and carrier IP technologies, fixed, mobile and converged broadband access, applications, and services, Alcatel-Lucent offers the end-to-end solutions that enable compelling communications services for people at work, at home and on the move.

With 77,000 employees and operations in more than 130 countries, Alcatel-Lucent is a local partner with global reach. The company has the most experienced global services team in the industry, and Bell Labs, one of the largest research, technology and innovation organizations focused on communications. Alcatel-Lucent achieved adjusted revenues of €17.8 billion in 2007, and is incorporated in France, with executive offices located in Paris.

Managing Communications Change

Change is being forced upon the utilities industry. Business drivers range from stakeholder pressure for greater efficiency to the changing technologies involved in operational energy networks. New technologies such as intelligent networks (smart grids), distribution automation and smart metering are all under consideration.

The communications network is becoming the key enabler for the evolution of reliable energy supply. However, few utilities today have a communications network that is robust enough to handle and support the exacting demands that energy delivery is now making.

It is this process of change – including the renewal of the communications network – that is vital for each utility’s future. But for the utility, this is a technological step change requiring different strategies and designs. It also requires new skills, all of which must be acquired in timescales that do not sit comfortably with traditional technology strategies.

The problems facing today’s utility include understanding the new technologies and assessing their capabilities and applications. In addition, the utility has to develop an appropriate strategy to migrate legacy technologies and integrate them with the new infrastructure in a seamless, efficient, safe and reliable manner.

This paper highlights the benefits utilities can realize by adopting a new approach to their customers’ needs and engaging a network partner that will take responsibility for the network upgrade, its renewal and evolution, and the service transition.

The Move to Smart Grids

The intent of smart grids is to provide better efficiency in the production, transport and delivery of energy. This is realized in two ways:

  • Better real-time control: ability to remotely monitor and measure energy flows more closely, and then manage those flows and the assets carrying them in real time.
  • Better predictive management: ability to monitor the condition of the different elements of the network, predict failure and direct maintenance. The focus is on being proactive to real needs prior to a potential incident, rather than being reactive to incidents, or performing maintenance on a repetitive basis whether it is needed or not.
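The predictive-management idea above can be sketched as a simple condition monitor: flag any asset whose recent readings trend toward a failure limit, so maintenance is scheduled before an incident rather than after one. The asset names, readings, and threshold below are hypothetical, chosen only to illustrate the approach.

```python
# Hypothetical sketch of condition-based (predictive) monitoring:
# flag assets whose recent readings average above a failure threshold,
# so maintenance can be directed proactively.

def needs_maintenance(readings, threshold, window=3):
    """Flag an asset if the average of its recent readings exceeds the threshold."""
    recent = readings[-window:]
    return sum(recent) / len(recent) > threshold

# Illustrative transformer oil-temperature samples (degrees C)
assets = {
    "transformer_a": [62, 64, 63, 65, 64],   # stable
    "transformer_b": [70, 78, 85, 91, 96],   # trending toward failure
}

ALARM_THRESHOLD = 80  # assumed limit for this sketch

flagged = [name for name, temps in assets.items()
           if needs_maintenance(temps, ALARM_THRESHOLD)]
print(flagged)  # → ['transformer_b']
```

A real deployment would of course use trend extrapolation or statistical models rather than a fixed average, but the principle – act on measured condition, not on a repetitive schedule – is the same.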

These mechanisms imply more measurement points and greater remote monitoring and management capabilities than exist today. And this requires a greater reliance on reliable, robust, highly available communications than has ever been the case before.

The communications network must continue to support operational services independently of external events, such as power outages or public service provider failure, yet be economical and simple to maintain. Unfortunately, the majority of today’s utility communications implementations fall far short of these stringent requirements.

Changing Environment

The design template for the majority of today’s energy infrastructure was developed in the 1950s and 1960s – and the same is true of the associated communications networks.

Typically, these communications networks have evolved into a series of overlays, often of different technology types and generations (see Figure 1). For example, protection tends to use its own dedicated network. The physical realization varies widely, from tones over copper via dedicated time division multiplexing (TDM) connections to dedicated fiber connections. These generally use a mix of privately owned and leased services.

Supervisory control and data acquisition (SCADA) systems generally still use modem technology at speeds between 300 baud and 9.6 kbaud. Again, the infrastructure is often copper or TDM running as one of many separate overlay networks.

Lastly, operational voice services (as opposed to business voice services) are frequently analog on yet another separate network.

Historically, there were good operational reasons for these overlays. But changes in device technology (for example, the evolution toward e-SCADA based on IP protocols), as well as the decreasing support by communications equipment vendors of legacy communications technologies, means that the strategy for these networks has to be reassessed. In addition, the increasing demand for further operational applications (for example, condition monitoring, or CCTV, both to support substation automation) requires a more up-to-date networking approach.

Tomorrow’s Network

With the exception of protection services, communications between network devices and the network control centers are evolving toward IP-based networks (see Figure 2). The benefits of this simplified infrastructure are significant and can be measured in terms of asset utilization, reduced capital and operational costs, ease of operation, and the flexibility to adapt to new applications. Consequently, utilities will find themselves forced to seriously consider the shift to a modern, homogeneous communications infrastructure to support their critical operational services.

Organizing For Change

As noted above, there are many cogent reasons to transform utility communications to a modern, robust communications infrastructure in support of operational safety, reliability and efficiency. However, some significant considerations should be addressed to achieve this transformation:

Network Strategy. It is almost inevitable that a new infrastructure will cross traditional operational and departmental boundaries within the utility. Each operational department will have its own priorities and requirements for such a network, and traditionally, each wants some, or total, control. However, to achieve real benefits, a greater degree of centralized strategy and management is required.

Architecture and Design. The new network will require careful engineering to ensure that it meets the performance-critical requirements of energy operations. It must maintain or enhance the safety and reliability of the energy network, as well as support the traffic requirements of other departments.

Planning, Execution and Migration. Planning and implementation of the core infrastructure is just the start of the process. Each service requires its own migration plan and has its own migration priorities. Each element requires specialist technical knowledge and, preferably, practical field experience.

Operation. Gone are the days when a communications failure was rectified by sending an engineer into the field to find the fault and to fix it. Maintaining network availability and robustness calls for sound operational processes and excellent diagnostics before any engineer or technician hits the road. The same level of robust centralized management tools and processes that support the energy networks have to be put in place to support the communications network – no matter what technologies are used in the field.

Support. Although these technologies are well understood by the telecommunications industry, they are likely to be new to the energy utilities industry. This means that a solid support organization familiar with these technologies must be implemented. The evolution process requires an intense level of up-front skills and resources. Often these are not readily available in-house – certainly not in the volume required to make any network renewal or transformation effective. Building up this skill and resource base by recruitment will not necessarily yield staff that is aware of the peculiarities of the energy utilities market. As a result, there will be significant time lag from concept to execution, and considerable risk for the utility as it ventures alone into unknown territory.

Keys To Successful Engagement

Engaging a services partner does not mean ceding control through a rigid contract. Rather, it means crafting a flexible relationship that takes into consideration three factors: What is the desired outcome of the activity? What is the best balance of scope between partner assistance and in-house performance to achieve that outcome? How do you retain the flexibility to accommodate change while retaining control?

Desired outcome is probably the most critical element and must be well understood at the outset. For one utility, the desired outcome may be to rapidly enable the upgrade of the complete energy infrastructure without having to incur the upfront investment in a mass recruitment of the required new communications skills.

For other utilities, the desired outcome may be different. But if the outcomes include elements of time pressure, new skills and resources, and/or network transformation, then engaging a services partner should be seriously considered as one of the strategic options.

Second, not all activities have to be in scope. The objective of the exercise might be to supplement existing in-house capabilities with external expertise. Or, it might be to launch the activity while building up appropriate in-house resources in a measured fashion through the Build-Operate-Transfer (BOT) approach.

In looking for a suitable partner, the utility seeks to leverage not only the partner’s existing skills, but also its experience and lessons learned performing the same services for other utilities. Having a few bruises is not a bad thing – this means that the partner understands what is at stake and the range of potential pitfalls it may encounter.

Lastly, retaining flexibility and control is a function of the contract between the two parties which should be addressed in their earliest discussions. The idea is to put in place the necessary management framework and a robust change control mechanism based on a discussion between equals from both organizations. The utility will then find that it not only retains full control of the project without having to take day-to-day responsibility for its management, but also that it can respond to change drivers from a variety of sources – such as technology advances, business drivers, regulators and stakeholders.

Realizing the Benefits

Outsourcing or partnering on the communications transformation will yield benefits, both tangible and intangible. It must be remembered that there is no standard “one-size-fits-all” outsourcing product. Thus, the benefits accrued will depend on the details of the engagement.

There are distinct tangible benefits that can be realized, including:

Skills and Resources. A unique benefit of outsourcing is that it eliminates the need to recruit skills not available internally. These are provided by the partner on an as-needed basis. The additional advantage for the utility is that it does not have to bear the fixed costs once they are no longer required.

Offset Risks. Because the partner is responsible for delivery, the utility is able to mitigate risk. For example, traditionally vendors are not motivated to do anything other than deliver boxes on time. But with a well-structured partnership, there is an incentive to ensure that the strategy and design are optimized to economically deliver the required services and ease of operation. Through an appropriate regime of business-related key performance indicators (KPIs), there is a strong financial incentive for the partner to operate and upgrade the network to maintain peak performance – something that does not exist when an in-house organization is used.

Economies of Scale. Outsourcing can bring the economies of scale resulting from synergies together with other parts of the partner’s business, such as contracts and internal projects.

There also are many other benefits associated with outsourcing that are not as immediately obvious and commercially quantifiable as those listed above, but can be equally valuable.

Some of these less tangible benefits include:

Fresh Point of View. Within most companies, employees often have a vested interest in maintaining the status quo. But a managed services organization has a vested interest in delivering the best possible service to the customer – a paradigm shift in attitude that enables dramatic improvements in performance and creativity.

Drive to Achieve Optimum Efficiency. Executives, freed from the day-to-day business of running the network, can focus on their core activities, concentrating on service excellence rather than complex technology decisions. To quote one customer, “From my perspective, a large amount of my time that might have in the past been dedicated to networking issues is now focused on more strategic initiatives concerned with running my business more effectively.”

Processes and Technologies Optimization. Optimizing processes and technologies to improve contract performance is part of the managed services package and can yield substantial savings.

Synergies with Existing Activities Create Economies of Scale. A utility and a managed services vendor have considerable overlap in the functions performed within their communications engineering, operations and maintenance activities. For example, a multi-skilled field force can install and maintain communications equipment belonging to a variety of customers. This not only provides cost savings from synergies with the equivalent customer activity, but also an improved fault response due to the higher density of deployed staff.

Access to Global Best Practices. An outsourcing contract relieves a utility of the time-consuming and difficult responsibility of keeping up to speed with the latest thinking and developments in technology. Alcatel-Lucent, for example, invests around 14 percent of its annual revenue into research and development; its customers don’t have to.

What Can Be Outsourced?

There is no one outsourcing solution that fits all utilities. The final scope of any project will be entirely dependent on a utility’s specific vision and current circumstances.

The following list briefly describes some of the functions and activities that are good possibilities for outsourcing:

Communications Strategy Consulting. Before making technology choices, the energy utility needs to define the operational strategy of the communications network. Too often communications is viewed as “plug and play,” which is hardly ever the case. A well-thought-out communications strategy will deliver this kind of seamless operation. But without that initial strategy, the utility risks repeating past mistakes and acquiring an ad-hoc network that will rapidly become a legacy infrastructure, which will, in turn, need replacing.

Design. Outsourcing allows utilities to evolve their communications infrastructure without upfront investment in incremental resources and skills. A utility can delegate responsibility for defining the network architecture and the associated network support systems. It may elect to leave all technological decisions to the vendor and merely review progress and outcomes. Or, it may retain responsibility for technology strategy, and turn to the managed services vendor to translate that strategy into an architecture and manage the subsequent design and project activities.

Build. Detailed planning of the network, the rollout project and the delivery of turnkey implementations all fall within the scope of the outsourcing process.

Operate, Administer and Maintain. Includes network operations and field and support services:

  • Network Operations. A vendor such as Alcatel-Lucent has the necessary experience in operating Network Operations Centers (NOCs), both on a BOT and ongoing basis. This includes handling all associated tasks such as performance and fault monitoring, and services management.
  • Network and Customer Field Services. Today, few energy utilities consider outside maintenance and provisioning activities to be a strategic part of their business, and most recognize that these are prime candidates for outsourcing. Activities that can be outsourced include corrective and preventive maintenance; network and service provisioning; and spare parts management, return and repair – in other words, all the daily, time-consuming, but vitally important elements of running a reliable network.
  • Network Support Services. Behind the first-line activities of the NOC are a set of engineering support functions that assist with more complex faults – functions that cannot be automated and that tend to duplicate those of the vendor. The integration and sharing of these functions enabled by outsourcing can significantly improve the utility’s efficiency.

Conclusion

Outsourcing can deliver significant benefits to a utility, both in its ability to invest in and improve its operations and in the associated costs. However, each utility has its own unique circumstances, specific immediate needs, and vision of where it is going. Therefore, each technical and operational solution is different.

Silver Spring Networks

When engineers built the national electric grid, their achievement made every other innovation built on or run by electricity possible – from the car and airplane to the radio, television, computer and the Internet. Over decades, all of these inventions have gotten better, smarter and cheaper while the grid has remained exactly the same. As a result, our electrical grid is operating under tremendous stress. The Department of Energy estimates that by 2030, demand for power will outpace supply by 30 percent. And this increasing demand for low-cost, reliable power must be met alongside growing environmental concerns.

Silver Spring Networks (SSN) offers the first proven technology to enable the smart grid. SSN is a complete smart grid solutions company that enables utilities to achieve operational efficiencies, reduce carbon emissions and offer their customers new ways to monitor and manage their energy consumption. SSN provides hardware, software and services that allow utilities to deploy and run unlimited advanced applications, including smart metering, demand response, distribution automation and distributed generation, over a single, unified network.

The smart grid should operate like the Internet for energy, without proprietary networks built around a single application or device. In the same way that one can plug any laptop or device into the Internet, regardless of its manufacturer, utilities should be able to “plug in” any application or consumer device to the smart grid. SSN’s Smart Energy Network is based on open, Internet Protocol (IP) standards, allowing for continuous, two-way communication between the utility and every device on the grid – now and in the future.

The IP networking standard adopted by Federal agencies has proven secure and reliable over decades of use in the information technology and finance industries. This network provides a high-bandwidth, low-latency and cost-effective solution for utility companies.

SSN’s Network Interface Cards (NICs) are installed in “smart” devices, like smart meters at the consumer’s home, allowing them to communicate with SSN’s access points. Each access point communicates with networked devices over a radius of one or two miles, creating a wireless communication mesh that connects the devices on the grid to one another and to the utility’s back office.
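The mesh topology described above can be illustrated with a simple adjacency computation: any two nodes within radio range form a direct link, and a device out of range of the access point can still reach the utility by relaying through a neighbor. The coordinates, device names, and 1.5-mile range below are invented for illustration and are not SSN parameters.

```python
import math

# Illustrative sketch of mesh formation: each device links to every
# neighbor within radio range; out-of-range devices relay via neighbors.
RANGE_MILES = 1.5  # assumed radio range for this sketch

devices = {
    "access_point": (0.0, 0.0),
    "meter_1": (0.5, 0.4),
    "meter_2": (1.0, 0.8),
    "meter_3": (2.0, 1.6),   # out of direct range of the access point
}

def links(nodes, radio_range):
    """Return pairs of nodes close enough to communicate directly."""
    names = list(nodes)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = nodes[a]
            bx, by = nodes[b]
            if math.hypot(ax - bx, ay - by) <= radio_range:
                pairs.append((a, b))
    return pairs

print(links(devices, RANGE_MILES))
```

In this toy layout, `meter_3` has no direct link to the access point but does link to `meter_2`, so its traffic can still reach the back office through the mesh – the property that gives a mesh its self-healing character.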

Using the Smart Energy Network, utilities will be able to remotely connect or disconnect service, send pricing information to customers who can understand how much their energy is costing in real time, and manage the integration of intermittent renewable energy sources like solar panels, plug-in electric vehicles and wind farms.

In addition to providing The Smart Energy Network and the software/firmware that makes it run smoothly, SSN develops applications like outage detection and restoration, and provides support services to its utility customers. By minimizing or eliminating interruptions, the self-healing grid could save industrial and residential consumers over $100 billion per year.

Founded in 2002 and headquartered in Redwood City, Calif., SSN is a privately held company backed by Foundation Capital, Kleiner Perkins Caufield & Byers and Northgate Capital. The company has over 200 employees and a global reach, with partnerships in Australia, the U.K. and Brazil.

SSN is the leading smart grid solutions provider, with successful deployments with utilities serving 20 percent of the U.S. population, including Florida Power & Light (FPL), Pacific Gas & Electric (PG&E), Oklahoma Gas & Electric (OG&E) and Pepco Holdings, Inc. (PHI), among others.

FPL is one of the largest electric utilities in the U.S., serving approximately 4.5 million customers across Florida. In 2007, SSN and FPL partnered to deploy SSN’s Smart Energy Network to 100,000 FPL customers. The project began with rigorous environmental and reliability testing to ensure that SSN’s technology would hold up under the harsh environmental conditions in some areas of Florida. Few companies are able to sustain the scale and quality of testing that FPL required during this deployment, including power outage notification testing, exposure to water and salt spray, and network throughput performance testing for self-healing failover characteristics.

SSN’s solution has met or exceeded all FPL acceptance criteria. FPL plans to continue deployment of SSN’s Smart Energy Network at a rate of one million networked meters per year beginning in 2010 to all 4.5 million residential customers.

PG&E is currently rolling out SSN’s Smart Energy Network to all 5 million electric customers over a 70,000 square-mile service area.

OG&E, a utility serving 770,000 customers in Oklahoma and western Arkansas, worked with SSN to deploy a small-scale pilot project to test The Smart Energy Network and gauge customer satisfaction. The utility deployed SSN’s network, along with a web-based energy management portal, in 25 homes in northwest Oklahoma City. Another 6,600 apartments were given networked meters to allow remote initiation and termination of service.

Consumer response to the project was overwhelmingly positive. Participating residents said they gained flexibility and control over their household’s energy consumption by monitoring their usage on in-home touch screen information panels. According to one customer, “It’s the three A’s: awareness, attitude and action. It increased our awareness. It changed our attitude about when we should be using electricity. It made us take action.”

Based on the results, OG&E presented a plan for expanded deployment to the Oklahoma Corporation Commission for their consideration.

PHI recently announced its partnership with SSN to deliver The Smart Energy Network to its 1.9 million customers across Washington, D.C., Delaware, Maryland and New Jersey. The first phase of the smart grid deployment will begin in Delaware in March 2009 and involve SSN’s advanced metering and distribution automation technology. Additional deployment will depend on regulatory authorization.

The impact of energy efficiency is enormous. More aggressive energy efficiency efforts could cut the growth rate of worldwide energy consumption by more than half over the next 15 years, according to the McKinsey Global Institute. The Brattle Group states that demand response could reduce peak load in the U.S. by at least 5 percent over the next few years, saving over $3 billion per year in electricity costs. The discounted present value of these savings would be $35 billion over the next 20 years in the U.S. alone, with significantly greater savings worldwide.
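The discounted-savings figure above follows from a standard present-value calculation over the stream of annual savings. The discount rate below is an assumption chosen to illustrate the arithmetic, not a figure taken from the Brattle Group study.

```python
# Sketch of the present-value arithmetic behind the cited savings:
# roughly $3 billion per year over 20 years, discounted to today.
ANNUAL_SAVINGS = 3.0   # $ billions per year (from the text)
YEARS = 20
DISCOUNT_RATE = 0.06   # assumed rate for illustration only

pv = sum(ANNUAL_SAVINGS / (1 + DISCOUNT_RATE) ** t
         for t in range(1, YEARS + 1))
print(round(pv, 1))  # ≈ 34.4, close to the ~$35 billion cited
```

At an assumed 6 percent discount rate, the present value of the 20-year savings stream comes out near the $35 billion the text cites; a slightly lower rate reproduces it exactly.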

Governments throughout the EU, Canada and Australia are now mandating implementation of alternative energy and grid efficiency network programs. The Smart Energy Network is the technology platform that makes energy efficiency and the smart grid possible. And, it is working in the field today.

Measuring Smart Metering’s Progress

Smart or advanced electricity metering, using a fixed network communications path, has been with us since pioneering installations in the US Midwest in the mid-1980s. That was 25 years ago, and in the time since we have seen incredible advancements in information and communication technologies.

Remember the technologies of 1985? The very first mobile phones were just being introduced. They weighed as much as a watermelon and cost nearly $9,000 in today’s dollars. SAP had just opened its first sales office outside of Germany, and Oracle had fewer than 450 employees. The typical personal computer had a 10 megabyte hard drive, and a dot-com Internet domain was just a concept.

We know how much these technologies have changed since then, how they have been embraced by the public, and (to some degree at least) where they are going in the future. This article looks at how smart metering technology has developed over the same period. What has been the catalyst for advancements? And, most important, what does that past tell us about the future of smart metering?

Peter Drucker once said that “trying to predict the future is like trying to drive down a country road at night with no lights while looking out the back window.”

Let’s take a brief look out the back window, before driving forward.

Past Developments

Developments in the parallel field of wireless communications, with its strong standards base, are readily delineated into clear technology generations. While we cannot as easily pinpoint definitive phases of smart metering technology, we can see some major transitions and discern patterns from the large deployments illustrated in Figure 1, and perhaps, even identify three broad smart metering “generations.”

The first generation is probably the clearest to delineate. The first 10 years of smart metering deployments (until about 2004) were all one-way wireless, limited two-way wireless, or very low-bandwidth power-line carrier communications (PLC) to the meter, concentrated in the U.S. The market at this time was dominated by Distribution Control Systems, Inc. (DCSI) and what was then CellNet Data Systems, Inc. Itron’s Fixed Network 2.0 and Hunt Technologies’ TS1 solution would also fit into this generation.

More than technology, the strongest characteristic of this first generation is the limited scope of business benefits considered. With the exception of Puget Sound Energy’s time-of-use pricing program, the business case for these early deployments was focused almost exclusively on reducing meter reading costs. Effectively, these early deployments reproduced the same business case as mobile automated meter reading (AMR).

By 2004, approximately 10 million of these smart meters had been installed in the U.S. (about 7 percent of the national total); however, public perception of smart metering, such as it was, remained decidedly mixed. The deployments received scant media coverage, which focused almost solely on troubled time-of-use pricing programs, digressing only briefly to cover smart metering vendor mergers and lawsuits. But generally, smart meters, by any name, were unknown to the general population.

Today’s Second Generation

By the early 2000s, some utilities, notably PPL and PECO, both in Pennsylvania, were beginning to expand the use of their smart metering infrastructure beyond the simple meter-to-cash process. With incremental enhancements to application integration that were based on first generation technology, they were initiating projects to use smart metering to: transform outage identification and response; explore more frequent reading and more granular data; and improve theft detection.

These initiatives were the first to give shape to a new perspective on smart metering, but it was power company Enel’s dramatic deployment of 30 million smart meters across Italy that crystallized the second generation.

For the four years leading up to 2005, Enel fully deployed key technology advancements, such as universal and integrated remote disconnect and load limiting, that previously did not exist at any real scale. These changes enabled a dramatically broader scope of business benefits: this was the first fully deployed solution designed from the ground up to look well beyond reducing meter reading costs.

The impact of Enel’s deployment and subsequent marketing campaign on smart metering developments in other countries should not be underestimated, particularly among politicians and regulators outside the U.S. In European countries, particularly Italy, and regions such as Scandinavia, the same model (and in many cases the same technology) was deployed. Enel demonstrated to the rest of the world what could be done without any high-profile public backlash. It set a competitive benchmark that had policymakers in other countries questioning progress in their jurisdictions and challenging their own utilities to achieve the same.

North American Resurgence

As significant as Enel’s deployment was to the global development of smart metering, it is not the basis for the smart metering technology deployments now concentrated in North America.

More than the challenges of translating a European technology to North America, the business objectives and customer environments were different. As the Enel deployment came to an end, governments and regulators – particularly those in California and Ontario – were looking for smart metering technology to be the foundation for major energy conservation and peak-shifting programs. They expected the technology to support a broad range of pricing programs, provide on-demand reads within minutes, and gather hourly interval profile data from every meter.

Utilities responded. Pacific Gas & Electric (PG&E), with a total of 9 million electric and natural gas meters, kick-started the movement. Others, notably Southern California Edison (SCE), invested the time and effort to advance the technology, championing additions such as remote firmware upgrades and home area network support.

As a result, a near-dormant North American smart metering market was revived in 2007. The standard functionality we see in most smart metering specifications today and the technology basis for most planned deployments in North America were established.

These technology changes also contributed to a shift in public awareness of smart meters. As smart metering was considered by more local utilities, and more widely associated with growing interest in energy conservation, media interest grew exponentially. Between 2004 and 2008, references to smart or advanced meters (carefully excluding smart parking meters) in the world’s major newspapers nearly doubled every year, to the point where the technology is now almost common knowledge in many countries.

The Coming Third Generation

In the 25 years since smart meters were first substantially deployed, the technology has progressed considerably. While progress has not been as rapid as advancements in consumer communications technologies, smart metering developments such as universal interval data collection, integrated remote disconnect and load limiting, remote firmware upgrades and links to a home network are substantial advancements.

All of these advancements have been driven by the combination of forward-thinking government policymakers, a supportive regulator and, perhaps most important, a large utility willing to invest the time and effort to understand and demand more from the vendor community.

With this understanding of the drivers, and based on current technology deployment plans, we can map out key future smart metering technology directions. We expect the next generation of smart metering to exhibit two dominant differences from today’s technology: increased standardization across the entire smart metering solution scope, and changes to the back-office systems architecture that enable the extended benefits of smart metering.

Increased Standardization

The transition to the next generation of smart metering will be known more for changes to how a smart meter works than for what a smart meter does.

The direct functions of a smart meter appear to be largely set. We expect continued incremental advancements in data quality and read reliability; improved power quality measurement; and more universal deployment of remote disconnect and load limiting.

But how a smart meter provides these functions will further change. We believe the smart meter will become a much more integrated part of two networks: one inside the home; the other along the electricity distribution network.

Generally, an expectation of standards for communication from the meter into a home area network is well accepted by the industry – although the actual standard to be applied is still in question. As this home area network develops, we expect a smart meter to increasingly become a member of this network, rather than the principal mechanism in creating one.

As other smart grid devices are deployed further down the low voltage distribution system, we expect utilities to demand that the meter conform to these network communications standards. In other words, utilities will continue to reject the idea that other types of smart grid devices – those with even greater control of the electrical network – be incorporated into a proprietary smart meter local area network.

It appears that most of this drive to standardization will not be led by utilities in North America. For one, technology decisions in North America are rapidly being completed (for this first round of replacements, at least). The recent Federal Energy Regulatory Commission (FERC) staff report, entitled “2008 Assessment of Demand Response and Advanced Metering,” found that of the 145 million meters in the U.S., utilities have already contracted to replace nearly 52 million with smart meters over the next five to seven years.

IBM’s analysis indicated that larger utilities have declared plans to replace these meters even faster – approximately 33 million smart meters by 2013. The meter communications approach, and quite often the vendors chosen for these deployments, has typically already been selected, leaving little room to fundamentally change the underlying technological approach.

Outside of Worldwide Interoperability for Microwave Access (WiMAX) experiments by utilities such as American Electric Power (AEP) and those in Ontario, and shared services initiatives in Texas and Ontario, none of the remaining large North American utilities appear to have a compelling need to drive dramatic technology advancements, given rate and time pressures from regulators.

Conversely, a few very large European programs are poised to push the technology toward much greater standards adoption:

  • EDF in France has started a trial of 300,000 meters using standards-based PLC communications from the meter to the concentrator. Full deployment to all 35 million EDF meters is expected to follow.
  • The U.K. government recently announced a mandatory replacement of both electricity and natural gas meters for all 46 million customers between 2010 and 2020. The U.K.’s unique market structure with competitive retailers having responsibility for meter ownership and operation is driving interoperability standards beyond currently available technology.
  • With its PRIME initiative, the Spanish utility Iberdrola plans to develop a new PLC-based, open standard for smart metering. It is starting with a pilot project in 2009, leading to full deployment to more than 10 million residential customers.

The combination of these three smart metering projects alone will affect 91 million smart meters, equal to two thirds of the total U.S. market. This European focus is expected to grow now that the Iberdrola project has taken the first steps to be the basis for the European Commission’s Open Meter initiative, involving 19 partners from seven European countries.

Rethinking Utility System Architectures

Perhaps the greatest changes to future smart metering systems will have nothing to do with the meter itself.

To date, standard utility applications for customer care and billing, outage management, and work management have been largely unchanged by smart metering. In fact, to reduce risk and meet schedules, utilities have understandably shielded legacy systems from the changes needed to support a smart meter rollout or new tariffs. They have looked to specialized smart metering systems, particularly meter data management systems (MDMS), to bridge the gap between a new smart metering infrastructure and their legacy systems.

As a result, many of the potential benefits of a smart metering infrastructure have yet to be fully realized. For instance, billing systems still operate on cycles set by past meter reading routes. Most installed outage management applications are unable to take advantage of a direct near-real-time connection to nearly every end point.

As application vendors catch up, we expect the third generation of smart meters to be characterized by changes to the overall utility architectures and the applications that comprise them. As applications are enhanced, and enterprise architectures adapted to the smart grid, we expect to see significant architectural changes, such as:

  • Much of the message brokering functions from disparate head-end systems to utility applications in an MDMS will migrate to the utility’s service bus.
  • As smart meters increasingly become devices on a standards-based network, more general network management applications now widely deployed for telecommunications networks will supplement vendor head-end systems.
  • Complex estimating and editing functions will become less valuable as the technology in the field becomes more reliable.
  • Security of the system, from home network to the utility firewall, needs to meet the much higher standards associated with grid operations, rather than those arising from the current meter-as-the-cash-register perspective.
  • Add-on functionality provided by some niche vendors will migrate to larger utility systems as they evolve to a smart metering world. For instance, Web presentment of interval data to customers will move from dedicated sites to become a broad part of utilities’ online offerings.
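The first architectural shift above – moving message brokering from disparate head-end systems onto the utility’s service bus – can be sketched in miniature. Everything here is illustrative: the vendor payload formats and the in-process `ServiceBus` class are hypothetical stand-ins for real head-end feeds and an enterprise service bus product.

```python
from typing import Callable

# Hypothetical adapters: each head-end vendor reports readings in its own format.
def vendor_a_adapter(raw: dict) -> dict:
    return {"meter_id": raw["mtr"], "kwh": raw["val"], "ts": raw["time"]}

def vendor_b_adapter(raw: dict) -> dict:
    return {"meter_id": raw["id"], "kwh": raw["reading_kwh"], "ts": raw["at"]}

class ServiceBus:
    """Toy in-process bus standing in for an enterprise service bus."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, fn: Callable[[dict], None]) -> None:
        self.subscribers.append(fn)

    def publish(self, msg: dict) -> None:
        for fn in self.subscribers:
            fn(msg)

bus = ServiceBus()
received = []
bus.subscribe(received.append)  # e.g., a billing or outage application

for adapter, raw in [
    (vendor_a_adapter, {"mtr": "M1", "val": 12.5, "time": "2009-07-01T12:00"}),
    (vendor_b_adapter, {"id": "M2", "reading_kwh": 9.0, "at": "2009-07-01T12:15"}),
]:
    bus.publish(adapter(raw))  # every subscriber sees one normalized shape
```

The design point is that subscribing applications see a single normalized message shape, so integrating another head-end vendor means writing one more adapter rather than reworking every downstream system.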

Conclusions

Looking back at 25 years of smart metering technology development, we can see that while the technology has progressed, it has not developed at the pace of the consumer communications and computing technologies it relies upon – and for good reasons.

Utilities operate under a very different investment timeframe compared to consumer electronics; decisions made by utilities today need to stand for decades, rather than mere months. While consumer expectations of technology and service continue to grow with each generation, in the regulated electricity distribution industry, any customer demands are often filtered through a blurry political and regulatory lens.

Even with these constraints, smart metering technology has evolved rapidly, and will continue to change in the future. The next generation, with increased standardized integration with other networks and devices, as well as changes to back-office systems, will certainly transform what we now call smart metering. So much so that much sooner than 25 years from now, those looking back at today’s smart meters may very well see them as we now see those watermelon-sized cell phones of the 1980s.

Modeling Distribution Demand Reduction

In the past, distribution demand reduction was a technique used only in emergency situations a few times a year – if that. It was an all-or-nothing capability that you turned on, and hoped for the best until the emergency was over. Few utilities could measure the effectiveness, let alone the potential of any solutions that were devised.

Now, demand reduction is evolving to better support the distribution network during typical peaking events, rather than just emergencies. However, in this mode, it is important not only to understand the solution’s effectiveness, but to be able to treat it like any other dispatchable load-shaping resource. Advanced modeling techniques and capabilities are allowing utilities to do just that. This paper outlines various methods and tools that allow utilities to model distribution demand reduction capabilities within set time periods, or even in near real time.

Electricity demand continues to outpace the industry’s ability to build new generation and the supporting infrastructure needed to meet ever-growing, demand-side increases driven by population growth and smart residences across the globe. In most parts of the world, electrical energy is one of the defining features of modern civilization. It helps produce our food, keeps us comfortable, and provides lighting, security, information and entertainment. In short, it is a part of almost every facet of life, and without electrical energy, the modern interconnected world as we know it would cease to exist.

Every country has one or more initiatives underway, or in planning, to deal with some aspect of generation and storage, delivery or consumption issues. Additionally, greenhouse gases (GHG) and carbon emissions need to be tightly controlled and monitored. This must be carefully balanced with expectations from financial markets that utilities deliver balanced and secure investment portfolios by demonstrating fiduciary responsibility to sustain revenue projections and measured growth.

The architects of today’s electric grid probably never envisioned the day when electric utility organizations would purposefully take measures to reduce the load on the network, deal with highly variable localized generation and reverse power flows, or anticipate a regulatory climate that impacts the decisions for these measures. They designed the electric transmission and distribution systems to be robust, flexible and resilient.

When first conceived, the electric grid was far from stable and resilient. It took growth, prudence and planning to continue the expansion of the electric distribution system. This grid was made up of a limited number of real power and reactive power devices that responded to occasional changes in power flow and demand. However, it was also designed in a world with far fewer people, with a virtually unlimited source of power, and without much concern or knowledge of the environmental effects that energy production and consumption entail.

To effectively mitigate these complex issues, a new type of electric utility business model must be considered. It must rapidly adapt to ever-changing demands in terms of generation, consumption, and environmental and societal benefits. A grid made up of many intelligent and active devices that can manage consumption from both the consumer and utility side of the meter must be developed. This new business model will utilize demand management as a key element of the utility’s operation, while at the same time shaping consumer consumption behavior.

To that end, a holistic model is needed that understands all aspects of the energy value chain across generation, delivery and consumption, and can optimize the solution in real time. While a unifying model may still be a number of years away, much can be gained today from modeling and visualizing the distribution network to gauge the effect that demand reduction can – and does – have in near real time. The solutions that follow are a step in that direction.

Advanced Feeder Modeling

First, a utility needs to understand in more detail how its distribution network behaves. When distribution networks were conceived, they were designed primarily with sources (the head of the feeder and substation) and sinks (the consumers or load) spread out along the distribution network. Power flows were assumed to be one direction only, and the feeders were modeled for the largest peak level.

Voltage and volt-ampere reactive power (VAR) management were generally considered for loss optimization, not load reduction. Little thought was given to limiting power to segments of the network, or to distributed storage and generation, all of which can dramatically affect flows on the network, even causing reverse flows at times. Sensors to measure voltage and current were applied at the head of the feeder and at a few critical points (mostly in historical problem areas).

Planning feeders at most utilities is an exercise performed when large changes are anticipated (i.e., a new subdivision or major customer) or on a periodic basis, usually every three to five years. Loads were traditionally well understood with predictable variability, so this type of approach worked reasonably well. The utility also was in control of all generation sources on the network (i.e., peakers), and when there was a need for demand reduction, it was controlled by the utility, usually only during critical periods.

Today’s feeders are much more complex, and are being significantly influenced by both generation and demand from entities outside the control of the utility. Even within the utility, various seemingly disparate groups will, at times, attempt to alter power flows along the network. The simple model of worst-case peaking on a feeder is not sufficient to understand the modern distribution network.

The following factors must be considered in the planning model:

  • Various demand-reduction techniques, when and where they are applied and the potential load they may affect;
  • Use of voltage reduction as a load-shedding technique, and where it will most likely yield significant results (i.e., resistive load);
  • Location, size and capacity of storage;
  • Location, size and type of renewable generation systems;
  • Use and location of plug-in electrical vehicles;
  • Standby generation that can be fed into the network;
  • Various social ecosystems and their characteristics to influence load; and
  • Location and types of sensors available.

Generally, feeders are modeled as a single unit with their power characteristic derived from the maximum peaking load and connected kilovolt-amperage (kVA) of downstream transformers. A more advanced model treats the feeder as a series of connected segments. The segment definitions can be arbitrary, but are generally chosen where the utility will want to understand and potentially control these segments differently than others. This may be influenced by voltage regulation, load curtailment, stability issues, distributed generation sources, storage, or other unique characteristics that differ from one segment to the next.
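A segment-oriented feeder model of the kind described above can be sketched as a simple data structure. The class and field names here are hypothetical illustrations; a production model would carry far more electrical detail (impedances, phasing, regulation settings).

```python
from dataclasses import dataclass, field

@dataclass
class FeederSegment:
    """One controllable stretch of a feeder (field names are illustrative)."""
    name: str
    connected_kva: float       # sum of downstream transformer nameplate kVA
    peak_kw: float             # observed or allocated segment peak load
    has_regulation: bool = False
    der_kw: float = 0.0        # distributed generation/storage on the segment

@dataclass
class Feeder:
    name: str
    segments: list = field(default_factory=list)

    def peak_kw(self) -> float:
        # Net the segment peaks, crediting local generation against load,
        # instead of treating the feeder as one worst-case lump.
        return sum(max(s.peak_kw - s.der_kw, 0.0) for s in self.segments)

feeder = Feeder("FDR-12", [
    FeederSegment("seg-1", connected_kva=1500.0, peak_kw=900.0, has_regulation=True),
    FeederSegment("seg-2", connected_kva=800.0, peak_kw=520.0, der_kw=100.0),
])
print(feeder.peak_kw())  # 1320.0
```

Because each segment carries its own attributes, questions such as “where is voltage regulation available?” or “which segments host distributed generation?” become simple queries rather than separate studies.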

The following serves as an advanced means of modeling electrical distribution feeder networks. It provides for segmentation and sensor placement in the absence of a complete network and historical usage model. The modeling combines traditional electrical engineering and power-flow modeling (with tools such as CYME) with non-traditional approaches using geospatial and statistical analysis.

The model builds upon information such as usage data, network diagrams, device characteristics and existing sensors. It then adds elements that could create discrepancies with the known model, such as social behavior, demand-side programs, and future grid operations, based on both spatio-temporal and statistical modeling. Finally, suggestions can be made about sensor placement and characteristics to support monitoring of the system once it is in place.

Generally, a utility would take a more simplistic view of the problem. It would start by directly applying statistical analysis and stochastic modeling across the grid to develop a generic methodology for selecting the number of sensors, and where to place them, based on sensor accuracy, cost and the risk of error introduced by basic modeling assumptions (load allocation, timing of peak demand, and other influences on error). However, doing so would limit the utility to the data it already has, in an environment that will be changing dramatically.

The recommended approach performs some analysis up front to determine what the potential error sources are, which sources are material to the sensor question, and which could influence the system’s power flows. Next, an attempt can be made to geographically characterize where on the grid these influences are most significant. Then, a statistical approach can be applied to develop a model for setting the number, type and location of additional sensors. Lastly, sensor density and placement can be addressed.
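The statistical step can be illustrated with a deliberately simple model: if sensor errors are independent and unbiased, the estimation error for a segment shrinks roughly as 1/√n, so the sensor count needed to hit a target error can be solved for directly. This is a sketch of the idea only, not the stochastic model a utility would actually fit; the error figures are hypothetical.

```python
import math

def sensors_needed(base_error_pct: float, target_error_pct: float) -> int:
    """Smallest sensor count n such that base_error / sqrt(n) <= target.

    Assumes independent, unbiased sensor errors that average out as
    1/sqrt(n) -- a stand-in for a full stochastic placement model.
    """
    if base_error_pct <= target_error_pct:
        return 1
    return math.ceil((base_error_pct / target_error_pct) ** 2)

# A high-influence segment might be held to a tighter error target
# than an ordinary one, and so warrant more sensors:
print(sensors_needed(6.0, 2.0))  # 9
print(sensors_needed(6.0, 4.0))  # 3
```

The quadratic growth is the practical takeaway: halving the acceptable error quadruples the sensor count, which is why grading segments before setting a single blanket standard pays off.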

Feeder Modeling Technique

Feeder conditioning is important to minimize losses, especially when the utility wants to moderate voltage levels as a load modification method. Without proper feeder conditioning and sufficient sensors to monitor the network, the utility risks either violating regulatory voltage limits or falling short of the optimal load reduction during voltage reduction operations.
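The benefit of voltage reduction described here is often summarized as a conservation voltage reduction (CVR) factor: the percent load change obtained per percent voltage change. The calculation is standard; the feeder numbers below are purely illustrative.

```python
def cvr_factor(p_before_kw: float, p_after_kw: float,
               v_before: float, v_after: float) -> float:
    """CVR factor = (% load change) / (% voltage change)."""
    dp_pct = (p_before_kw - p_after_kw) / p_before_kw * 100.0
    dv_pct = (v_before - v_after) / v_before * 100.0
    return dp_pct / dv_pct

# Illustrative measurement: a 2.5% voltage reduction shed 2% of feeder load.
f = cvr_factor(4000.0, 3920.0, 120.0, 117.0)
print(round(f, 2))  # 0.8
```

A mostly resistive feeder segment shows a higher CVR factor than one dominated by thermostatic or constant-power loads, which is exactly why the article stresses knowing, per segment, where voltage reduction will yield significant results.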

Traditionally, feeder modeling is a planning activity that is done at periodic (for example, yearly) intervals or during an expected change in usage. Tools such as CYME – CYMDIST provide feeder analysis using:

  • Balanced and unbalanced voltage drop analysis (radial, looped or meshed);
  • Optimal capacitor placement and sizing to minimize losses and/or improve voltage profile;
  • Load balancing to minimize losses;
  • Load allocation/estimation using customer consumption data (kWh), distribution transformer size (connected kVA), real consumption (kVA or kW) or the REA method. The algorithm treats multiple metering units as fixed demands and large metered customers as fixed loads;
  • Flexible load models for uniformly distributed loads and spot loads featuring independent load mix for each section of circuit;
  • Load growth studies for multiple years; and
  • Distributed generation.
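The load allocation/estimation item above can be approximated in a few lines. This is a simplified stand-in for CYMDIST’s algorithm, not its actual implementation: metered large customers are held fixed, and the remaining feeder-head demand is spread across distribution transformers in proportion to connected kVA. All names and figures are illustrative.

```python
def allocate_load(feeder_demand_kw: float,
                  transformers: dict,
                  fixed_loads_kw: dict) -> dict:
    """Allocate feeder-head demand to transformers by connected kVA.

    Large metered customers are treated as fixed loads; the remainder is
    distributed in proportion to transformer nameplate kVA.
    """
    remaining = feeder_demand_kw - sum(fixed_loads_kw.values())
    total_kva = sum(transformers.values())
    alloc = {name: remaining * kva / total_kva
             for name, kva in transformers.items()}
    alloc.update(fixed_loads_kw)
    return alloc

xfmrs = {"T1": 500.0, "T2": 300.0, "T3": 200.0}
alloc = allocate_load(1200.0, xfmrs, {"BigBox": 200.0})
print(alloc)  # {'T1': 500.0, 'T2': 300.0, 'T3': 200.0, 'BigBox': 200.0}
```

The proportional assumption is precisely the kind of modeling shortcut the article flags as a source of error when choosing sensor locations: where allocation by nameplate kVA diverges from actual usage, additional sensors earn their keep.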

However, in many cases, much of the information required to run an accurate model is not available: the data may not exist, the feeder usage paradigm may be changing, the sampling period may not represent true usage of the network, or important non-electrical characteristics may not be captured.

This represents a bit of a chicken-or-egg problem. A utility needs to condition its feeders to change the operational paradigm, but it also needs operational information to make decisions on where and how to change the network. The solution is a combination of using existing known usage and network data, and combining it with other forms of modeling and approximation to build the best future network model possible.

Therefore, this exercise refines traditional modeling with three additional techniques: geospatial analysis; statistical modeling; and sensor selection and placement for accuracy.

If a distribution management system (DMS) will be deployed, or is being considered, its modeling capability may be used as an additional basis and refinement employing simulated and derived data from the above techniques. Lastly, if high accuracy is required and time allows, a limited number of feeder segments can be deployed and monitored to validate the various modeling theories prior to full deployment.

The overall goals for using this type of technique are:

  • Limit customer over- or under-voltage;
  • Maximize returned megawatts in the system in load reduction modes;
  • Optimize the effectiveness of the DMS and its models;
  • Minimize cost of additional sensors to only areas that will return the most value;
  • Develop automated operational scenarios, test and validation prior to system-wide implementation; and
  • Provide a foundation for additional network automation capabilities.

The first step sets aside a short period of time to thoroughly vet possible influences on the number, spacing and value offered by additional sensors on the distribution grid. This involves understanding and obtaining the information that will most influence the model, and therefore the use of sensors. Such information could include historical load data, distribution network characteristics, transformer nameplate loading, customer survey data, weather data and other related information.

The second step is the application of geospatial analysis to identify areas of the grid most likely to have influences driving a need for additional sensors. It is important to recognize that within this step is a need to correlate those influential geospatial parameters with load profiles of various residential and commercial customer types. This step represents an improvement over simply applying the same statistical analysis generically over the entirety of the grid, allowing for two or more “grades” of feeder segment characteristics for which different sensor standards would be developed.

The third step is the statistical analysis and stochastic modeling to develop recommended standards and methodology for determining sensor placement based on the characteristic segments developed from the geospatial assessment. Items set aside as not material for sensor placement serve as a necessary input to the coming “predictive model” exercise.

Lastly, a traditional electrical and accuracy-based analysis is used to model the exact number and placement of additional sensors to support the derived models and planned usage of the system for all scenarios depicted in the model – not just summertime peaking.
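The middle of this pipeline (geospatial grading feeding a per-grade sensor standard) can be sketched end-to-end. The `influence` score and the per-grade sensor counts are hypothetical placeholders for the outputs of the geospatial and statistical analyses described above.

```python
def grade_segments(segments: list) -> dict:
    """Step two (sketch): grade segments by a geospatially derived score.

    'influence' stands in for a composite of the factors the analysis
    would weigh (DER density, load-profile variability, and so on).
    """
    return {s["name"]: ("high" if s["influence"] >= 0.5 else "standard")
            for s in segments}

# Step three (sketch): a sensor standard per segment grade, as produced
# by the statistical analysis. Counts are illustrative.
SENSOR_STANDARD = {"high": 3, "standard": 1}

def place_sensors(segments: list) -> dict:
    """Step four (sketch): turn per-grade standards into a placement plan."""
    grades = grade_segments(segments)
    return {name: SENSOR_STANDARD[grade] for name, grade in grades.items()}

plan = place_sensors([
    {"name": "seg-A", "influence": 0.7},
    {"name": "seg-B", "influence": 0.2},
])
print(plan)  # {'seg-A': 3, 'seg-B': 1}
```

Separating the grading from the standard keeps the two analyses independently revisable: new geospatial data changes grades, while a refined statistical model changes only the per-grade counts.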

Conclusion

The modern distribution network built for the smart grid will need to undergo significantly more detailed planning and modeling than a traditional network. No one tool is suited to the task, and it will take multiple disciplines and techniques to derive the most benefit from the modeling exercise. However, if a utility embraces the techniques described within this paper, it will not only have a better understanding of how its networks perform in various smart grid scenarios, but it will be better positioned to fully optimize its networks for load and loss optimization.

Future of Learning

The nuclear power industry is facing significant employee turnover, which may be exacerbated by the need to staff new nuclear units. To maintain a highly skilled workforce to safely operate U.S. nuclear plants, the industry must find ways to expedite training and qualification, enhance knowledge transfer to the next generation of workers, and develop leadership talent to achieve excellent organizational effectiveness.

Faced with these challenges, the Institute of Nuclear Power Operations (INPO), the organization charged with promoting safety and reliability across the 65 nuclear electric generation plants operating in the U.S., created a “Future of Learning” initiative. It identified ways the industry can maintain the same high standard of excellence and record of nuclear safety, while accelerating training development, individual competencies and plant training operations.

The nuclear power industry is facing the perfect storm. Like much of the industrialized world, it must address issues associated with an aging workforce since many of its skilled workers and nuclear engineering professionals are hitting retirement age, moving out of the industry and beginning other pursuits.

Second, as baby boomers transition out of the workforce, they will be replaced by an influx of Generation Y workers. Many workers in this “millennials” generation are not aware of the heritage driving the industry’s single-minded focus on safety. They are asking for new learning models that utilize the technologies which are so much a part of their lives.

Third, even as this big crew change takes place, there is increasing demand for electricity. Many are turning to cleaner technologies – solar, wind, and nuclear – to close the gap. And there is resurgence in requests for building new nuclear plants, or adding new reactors at existing plants. This nuclear renaissance also requires training and preparation to take on the task of safely and reliably operating our nuclear power plants.

It is estimated there will be an influx of 25,000 new workers in the industry over the next five years, with an additional 7,000 new workers needed if just a third of the new plants are built. Given that incoming workers are more comfortable using technology for learning, and that delivery models that include a blend of classroom-based, instructor-led, and Web-based methods can be more effective and efficient, the industry is exploring new models and a new mix of training.

INPO was created by the nuclear industry in 1979 following the Three Mile Island accident. It has 350 full-time and loaned employees. As a nonprofit organization, it is chartered to promote the highest levels of safety and reliability – in essence, to promote excellence – in the operation of nuclear electric generating plants. All U.S. nuclear operating companies are members.

INPO’s responsibilities include evaluating member nuclear site operations, accrediting each site’s nuclear training programs and providing assistance and information exchange. It has established the National Academy for Nuclear Training, and an independent National Nuclear Accrediting Board. INPO sends teams to sites to evaluate their respective training activities, and each station is reviewed at least every four years by the accrediting board.

INPO has developed guidelines for 12 specifically accredited programs (six operations and six maintenance/technical), including accreditation objectives and criteria. It also offers courses and seminars on leadership, where more than 1,500 individuals participate annually, from supervisors to board members. Lastly, it operates NANTeL (National Academy for Nuclear Training e-Learning system) with 200 courses for general employee training for nuclear access. More than 80,000 nuclear workers and sub-contractors have completed training over the Web.

The Future of Learning

In 2008, to systematically address workforce and training challenges, the INPO Future of Learning team partnered with IBM Workforce and Learning Solutions to conduct more than 65 one-on-one interviews with chief executive officers, chief nuclear officers, senior vice presidents, plant managers, plant training managers and other leaders in the extended industry community. The team also completed 46 interviews with plant staff during a series of visits to three nuclear power plants. Lastly, the team developed and distributed a survey to training managers at the 65 nuclear plants, achieving a 62 percent response rate.

These are statements the team heard:

  • “Need to standardize a lot of the training, deliver it remotely, preferably to a desktop, minimize the ‘You train in our classroom in our timeframe’ and have it delivered more autonomously so it’s likely more compatible with their lifestyles.”
  • “We’re extremely inefficient today in how we design/develop and administer training. We don’t want to carry inefficiencies that we have today into the future.”
  • “Right now, in all training programs, it’s a one-size-fits-all model that’s not customized to an individual’s background. Distance learning would enable this by allowing people to demonstrate knowledge and let some people move at a faster pace.”
  • “We need to have ‘real’ e-learning. We’ve been exposed to less than adequate, older models of e-learning. We need to move away from ‘page turners’ and onto quality content.”

Several recommendations were generated as a result of the study. The first focused on ways to improve INPO’s current training offerings by adding leadership development courses, ratcheting up the interactivity of the Web-based and e-learning offerings in NANTeL and developing a “nuclear citizenship” course for new workers in the industry.

Second, there were recommendations about better utilizing training resources across the industry by centralizing common training, beginning with instructor training and certification and generic fundamentals courses. It was estimated that 50 percent of the accredited training materials are common across the industry. To accomplish this objective, INPO is exploring an industry infrastructure that would enable centralized training material development, maintenance and delivery.

The last set of recommendations focused on methods for better coordination and efficiency of training, including developing processes for certifying vendor training programs, and providing a jump-start to common community college and university curriculum.

In 2009, INPO is piloting a series of Future of Learning initiatives that will help determine the feasibility, cost-effectiveness, readiness and acceptance of this first set of recommendations. It is starting to look more broadly at ways it can use learning technology to drive economies of scale, enable accelerative and prescriptive learning, and deliver value to the nuclear electric generation industry.

Where Do We Go From Here?

Beyond the initial perfect storm is another set of factors driving the future of learning.

First, consider the need for speed. It has been said that “If you are not learning at the speed of change, you are falling behind.”

In his “25 Lessons from Jack Welch,” the former CEO of General Electric said, “The desire, and the ability, of an organization to continuously learn from any source, anywhere – and to rapidly convert this learning into action – is its ultimate competitive advantage.” Giving individuals, teams and organizations the tools and technologies to accelerate and broaden their learning is an important part of the future of learning.

Second, consider the information explosion – the sheer volume of information available, the convenience of information access (due, in large part, to continuing developments in technology) and the diversity of information available. When there is too much information to digest, people cannot locate and make use of the information they need; when they cannot process the sheer volume, overload occurs. The future of learning should enable the learner to sort through information and find knowledge.

Third, consider new developments in technology. Generations X and Y are considered “digital natives.” They expect that the most current technologies are available to them – including social networking, blogging, wikis, immersive learning and gaming – and to not have them is unthinkable.

Impact of New Technology

The philosophy of training has morphed from “just-in-case” (teach them everything and hope they will remember when they need it), to “just-in-time” (provide access to training just before the point of need), to “just-for-me.” With respect to the latter, learning is presented in a preferred medium, with a learning path customized to reflect the student’s preferred learning style, and personalized to address the current and desired level of expertise within any given time constraint.

Imagine a scenario in which a maintenance technician at a nuclear plant has to replace a specialized valve – something she either hasn’t done in a while or hasn’t done before. In a Web 2.0 world, she should be able to run a query on her iPhone or similar handheld device and pull up documentation for that particular valve, access its maintenance records, view a video of the approved replacement procedure, or reach an expert who could coach her through the process.

Learning Devices

What needs to be in place to enable this vision of the future of learning? First, workers will need a device that can access the information by connecting over a secure wireless network inside the plant. Second, the learning has to be available in small chunks – learning nuggets or learning assets. Third, the learning needs to be assembled along the dimensions of learning style, desired and target level of expertise, time available and media type, among other factors. Finally, experts need to be identified, tagged to particular tasks and activities, and made accessible.

Fortunately, some of the same learning technology tools that will enable centralized maintenance and accelerated development will also facilitate personalized learning. When training is organized at a more granular level – the learning asset level – not only can it be leveraged over a variety of courses and courseware, it can also be re-assembled and ported to a variety of outputs such as lesson books, e-learning and m-learning (mobile-learning).

The example above pointed out another shift in our thinking about learning. Traditionally, our paradigm has been that learning occurs in a classroom, and when it occurs, it has taken the form of a course. In the example above, the learning takes place anywhere and anytime, moving from the formal classroom environment to an informal environment. Of course, just because learning is “informal” does not mean it is accidental, or that it occurs without preparation.

Some estimates claim 10 percent of our learning is achieved through formal channels, 20 percent from coaching, and 70 percent through informal means. Peter Henschel, former director of the Institute for Research on Learning, raised an important question: If nearly three-quarters of learning in corporations is informal, can we afford to leave it to chance?

There are still several open issues regarding informal learning:

  • How do we evaluate the impact/effectiveness of informal learning? (Informal learning, but formal demonstration of competency/proficiency);
  • How do we record one’s participation and skill-level progression in informal learning? (Informal learning, but formal recording of learning completion);
  • Who will create and maintain informal learning assets? (Informal learning, but formal maintenance and quality assurance of the learning content); and
  • When does informal learning need a formal owner (in a full- or part-time role)? (Informal learning, but will need formal policies to help drive and manage).
In the nuclear industry, accurate and up-to-date documentation is a necessity. As the nuclear industry moves toward more effective use of informal channels of learning, it will need to address these issues.

Immersive Learning (Or Virtual Worlds)

The final frontier for the future of learning is expansion into virtual worlds, also known as immersive learning. Although Second Life (SL) is the best-known virtual world, there are also emerging competitors, including Active Worlds, Forterra (OLIVE), Qwaq and Unisfair.

Created in 2003 by Linden Lab of San Francisco, SL is a three-dimensional, virtual world that allows users to buy “property,” create objects and buildings and interact with other users. Unlike a game with rules and goals, SL offers an open-ended platform where users can shape their own environment. In this world, avatars do many of the same things real people do: work, shop, go to school, socialize with friends and attend rock concerts.

From a pragmatic perspective, working in an immersive learning environment such as a virtual world provides several benefits that make it an effective alternative to real life:

  • Movement in 3-D space. A virtual world could be useful in any learning situation involving movement, danger, tactics, or quick physical decisions, such as emergency response.
  • Engendering Empathy. Participants experience scenarios from another person’s perspective. For example, the Future of Learning team is exploring ways to re-create the control room experience during the Three Mile Island incident, to provide a cathartic experience for the next generation workforce so they can better appreciate the importance of safety and human performance factors.
  • Rapid Prototyping and Co-Design. A virtual world is an inexpensive environment for quickly mocking up prototypes of tools or equipment.
  • Role Playing. By conducting role plays in realistic settings, instructors and learners can take on various avatars and play those characters.
  • Alternate Means of Online Interaction. Although users would likely not choose a virtual world as their primary online communication tool, it provides an alternative means of indicating presence and allowing interaction. Users can have conversations, share note cards, and give presentations. In some cases, SL might be ideal as a remote classroom or meeting place to engage across geographies and utility boundaries.

Robert Amme, a physicist at the University of Denver, has another laboratory in SL. Funded by a grant from the Nuclear Regulatory Commission, his team is building a virtual nuclear reactor to help train the next generation of environmental engineers on how to deal with nuclear waste (see Figure 1). The INPO Future of Learning team is exploring ways to leverage this type of learning asset as part of the nuclear citizenship initiative.

There is no doubt that nuclear power generation is once again on an upswing, but critical to its revival and longevity will be the manner in which we prepare the current and next generation of workers to become outstanding stewards of a safe, effective, clean-energy future.

Enabling Successful Business Outcomes Through Value-Based Client Relationships

Utilities are facing a host of challenges ranging from environmental concerns, aging infrastructure and systems, to Smart Grid technology and related program decisions. The future utility will be required to find effective solutions to these challenges, while continuing to meet the increasing expectations of newly empowered consumers. Cost management in addressing these challenges is important, but delivery of value is what truly balances efficiency with customer satisfaction.

Our Commitment

Vertex clients trust us to deliver on our promises and commitments, and they partner with us to generate new ideas that will secure their competitive advantage, while also delivering stakeholder benefits. Our innovative same-side-of-the-table approach allows us to transform the efficiency and effectiveness of your business operations, enabling you to lower your risk profile and enhance your reputation in the eyes of customers, investors and regulatory bodies. Working as partners, we provide unique insights that will generate actionable ideas and help you achieve new levels of operational excellence.

With a long heritage in the utility industry, Vertex possesses an in-depth knowledge and understanding of the issues and challenges facing utility businesses today. We actively develop insights and innovative ideas that allow us to work with our utility clients to transform their businesses, and we can enhance your future performance in terms of greater efficiencies, higher customer satisfaction, increased revenue and improved profitability.

Desired business outcomes are best achieved with a strategic, structured approach that leverages continuous improvement throughout. Vertex takes a four-level approach, which starts with asking the right questions. Levels 1 and 2 identify business challenges and the corresponding outcomes your utility hopes to achieve. Need to improve customer satisfaction? If so, is moving from the 2nd to 1st quartile the right target? Pinpointing the key business challenges that are limiting or impeding your success is critical. These may include a need to reduce bad debt, reduce costs, minimize billing errors, or improve CSR productivity. Whatever challenges you face, collaboration with our experts will ensure your utility is on the right track to meet or exceed your targets.

Once the challenges and outcomes have been identified and validated, Vertex partners with clients to develop effective solutions. The solutions implemented in Level 3 consist of unique value propositions that, when combined effectively, achieve the desired business outcome for the business challenge being addressed. Vertex’s proprietary “Value Creation Model” enables us to develop and implement solutions that provide measurable business results and ongoing quality assurance.

Inherent to the success of this model is the Vertex Transition Methodology, which has resulted in 200 successful transitions over a twelve-year period. Due diligence yields a clear understanding of how the business operates. Mobilizing activities lay the foundation for the transition, and a baseline for the transition plan is established. The plans developed during the planning stage are then implemented, followed by a stabilization period that runs from the business transfer until operations are fully established.

Another key element of this model lies in Vertex’s transformation capabilities, and what we refer to as our “6D” transformation methodology. Dream, Define, Design, Develop, Deliver, Drive – our Lean Six Sigma methods guarantee successful deployment of continuous process improvement results. In addition to Lean Six Sigma, the Vertex Transformation Methodology includes change management, people and performance management, and project management.

In Level 4 of the Vertex solution approach, Vertex measures the effectiveness of a solution by determining if it achieved the desired business outcome. We utilize a Balanced Scorecard approach to ensure that the business outcome positively impacts all of the key elements of a client’s business: Customer, Employee, Operational, and Financial. As desired business outcomes evolve, Vertex will remain committed to adapting our solutions in partnership with our clients to meet these changing needs.

Transforming Your Organization

If you’re ready to transform to an outcomes-based business, Vertex has the capability to help. Our service lines include: Consulting and Transformation, IT Applications Services and Products, Debt Management, and Meter-to-Cash Outsourcing.

Our transformation approach blends innovation and business process improvement, focusing on achieving your strategic objectives via our proven expertise and insights. We bring business transformation that secures greater efficiencies, improved effectiveness and enhanced services for your organization. All the while we never forget that our employees represent your brand.

We’ll work collaboratively with you, rapidly implementing services and delivering on continuous improvement to meet your goals. We’ll build on your business needs, sharing ideas and jointly developing options for change – working together to deliver real value.

Empower Your Customers To Reduce Energy Demand

The Energy Information Administration (EIA) forecasts a continuing gap between total domestic energy production and consumption through 2030. This delta will not be closed by supply alone; customer behavior changes are needed to reduce total consumption and peak load. Electric and gas utilities face tremendous challenges meeting energy supply and demand needs and will play a pivotal role in determining practical solutions. With the right approach, utilities will deliver on the promise of energy efficiency and demand response.

Energy market projections are highly speculative as the market is characterized by high price volatility and rapid market transformation. Adding to the uncertainty is the voluntary nature of demand response and energy efficiency programs, and the critical importance of customer behavior change. Utilities are spending billions of dollars, making program penetration essential – and customer education paramount. At an end-point cost of up to $300, a five percent penetration is not the answer. Vertex can help mitigate these risks through highly effective management of customer care, CIS integration, pilot programs, and analytics. Vertex’s core “meter-to-cash” capabilities have undergone a major revolution in response to the new world of AMI, energy efficiency, and demand response. A robust set of new services will allow utilities to transform how they do business.

Smart meters put new demands on CIS platforms and traditional business processes – innovative rates, distributed generation, demand response and new customer programs all require creative change. Vertex is currently helping utilities develop and manage customer programs to fully exploit smart meter deployments and provide customer care to customers migrating to time-based rates. We deliver customer management services designed to drive penetration and meet the unique customer care needs generated by smart meter installations; energy efficiency and demand response programs that empower customers to manage their energy use and reduce consumption; and cost-effective customer care and billing solutions to support smart meters.

Water utilities are not immune to the need for conservation. In the past 30 years, the U.S. population has grown more than 50 percent, while total water use has tripled. On average, Americans use approximately 75 to 80 gallons of water per person per day. Vertex can help water utilities address the unique conservation challenges they face, including customer care and program support, MDMS solutions to organize data for forecasting, code enforcement, business and customer insight, and other services.

Case Study – Hydro One

Hydro One is an Ontario, Canada-based utility that is one of the five largest transmission utilities in North America. As the steward of critical provincial assets, Hydro One works with its industry partners to ensure that electricity can be delivered safely, reliably, and affordably to its customers. Vertex has been providing Meter-to-Cash outsourcing services to Hydro One since 2002.

Applying the Vertex four-level solution approach enabled the desired business outcomes:

Level 1: Identify Business Challenges

In 2006, Hydro One approached Vertex and indicated that one of its corporate goals was to dramatically improve customer satisfaction as measured by the Hydro One customer satisfaction survey. At that point, Hydro One customer satisfaction scores on agent-handled calls had hovered in the 75-76% range for several years. Up to that time, the relationship with Vertex had focused on significant reductions to cost with no erosion of service offered to customers. Now, Hydro One was looking to Vertex to help lead the drive to improve the customer experience.

Level 2: Identify Desired Outcomes

In 2007 Vertex and Hydro One entered into collaborative discussions to evaluate and analyze the historical customer satisfaction scores, and to work jointly to develop a plan to radically modify the customer experience and improve customer satisfaction. Those discussions led down several paths, and the parties mutually agreed to target the following areas for change:

  • The Vertex/Hydro One Quality program
  • A cultural adjustment that would reflect the change in focus
  • Technology that could help support Hydro One’s goals
  • End-to-end process review

Level 3: Develop & Implement Solution

Vertex has worked closely with Hydro One to help them deliver on their goal of significant improvements to customer satisfaction. Changes were applied to process, call scripts, quality measures and performance scoring at all levels in the organization, including incentive compensation and recognition programs.

Level 4: Measure Solution Results

  • Customer satisfaction scores on agent-handled calls increased from 76% in 2006 to 86% in 2008
  • Quality monitoring program changes yielded a 10% increase in first-call resolution
  • Introduced bi-weekly Process/Quality forums
  • Monthly reviews with the client to reinforce success and progress toward targets

Business Process Improvement

In the past, the utility industry could consider itself exempt from market drivers like those listed above. However, today’s utilities are immersed in a sea of change. Customers demand reliable power in unlimited supply, generated in environmentally friendly ways without increased cost. All the while regulators are telling consumers to “change the way they are using energy or be ready to pay more,” and the Department of Energy is calling for utilities to make significant reductions in usage by 2020 [1].

“The consumer’s concept of quality will no longer be measured by only the physical attributes of the product – it will extend to the process of how the product is made, including product safety, environmental compliance and social responsibility compliance.”

– Victor Fung, chairman of Li & Fung,
in the 2008 IBM CEO Study

If these issues are not enough, couple them with a loss of knowledge and skill due to an aging workforce, an ever-increasing amount of automation and technology being introduced into our infrastructure with few standards, and tightening bond markets and economic declines requiring us to do more with less. Now more than ever, the industry needs to redefine our core competencies, identify key customers and their requirements, and define processes that meet or exceed their expectations. Business process improvement is essential to ensure future success for utilities.

There is no need to reinvent the wheel and develop a model for utilities to address business process improvement. One already exists that offers the most holistic approach to process improvement today. It is not new, but like any successful management method, it has been modified and refined to meet continuously changing business needs.

It is agnostic about the methods used for analysis and process improvement, such as Lean, Six Sigma and other tools, but serves as a framework for achieving results in any industry. It is the Baldrige Criteria for Performance Excellence (see Figure 1).

The Criteria for Performance Excellence is designed to help organizations focus on strategy-driven performance while addressing key decisions driving both short-term and long-term organizational sustainability in a dynamic environment. Is it possible that this framework was designed for times such as these in the utility industry?

The criteria are essentially simple in design. They are broken into seven categories, as shown in Figure 2: leadership; strategic planning; customer focus; measurement, analysis and knowledge management; workforce focus; process management; and results.

In this model, measurement, analysis and knowledge management establish the foundation. There are two triads. On the left-hand side of the model, leadership, strategic planning and customer focus make up the leadership triad. On the right-hand side, workforce focus, process management and results make up the results triad. The alignment and integration of these essential elements of business create a framework for continuous improvement. This model should appear familiar in concept to industry leaders; there is not a single utility in the industry that does not identify with these categories in some form.

The criteria are built to elicit a response through the use of “how” and “what” questions that ask about key processes and their deployment throughout the organization. At face value, these questions appear to be simple. However, as you respond to them, you will realize their linkage and begin to identify opportunities for improvement that are essential to future success. Leaders wishing to begin this effort should not be surprised by the depth of the questions and the relatively few members of the organization who will be able to provide complete answers.

To assess the model’s ability to meet utility industry needs, let’s discuss each category in greater detail, relate it to the utility industry and pose key questions for you to consider as you begin to assess your own organization’s performance.

Leadership: Who could argue that the current demand for leadership in utilities is more critical today than ever before in our history? Changes in energy markets are bringing with them increased levels of accountability, a greater focus on regulatory, legal and ethical requirements, a need for long-term viability and sustainability, and increased expectations of community support. Today’s leaders are expected to achieve ever increasing levels of operational performance while operating on less margin than ever before.

“The leadership category examines how senior leaders’ personal actions guide and sustain the organization. Also examined are the organization’s governance system and how it fulfills legal, ethical and societal responsibilities as well as how it selects and supports key communities [2].”

Strategic Planning: Does your utility have a strategic plan? Not a dust-laden document sitting on a bookshelf or a financial budget, but a plan that identifies strategic objectives and action plans to address short- and long-term goals. Our current business environment demands that we identify our core competencies (and, more importantly, what they are not), identify strategic challenges to organizational success, recognize strategic advantages and develop plans that ensure our efforts are focused on objectives that will ensure achievement of our mission and vision.

What elements of our business should we outsource? Do our objectives utilize our competitive advantages and core competencies to diminish organizational challenges? We all know the challenges that are both here today and await us just beyond the horizon. Many of them are common to all utilities: an aging workforce, decreased access to capital, technological change and regulatory change. How are we addressing them today? Is our approach systematic and proactive, or are we simply reacting to challenges as they arise?

“The strategic planning category examines how your organization develops strategic objectives and action plans. Also examined are how your chosen strategic objectives and action plans are deployed and changed if circumstances require, and how progress is measured [2].”

Customer Focus: The success of the utility industry has been due in part to a long-term positive relationship with its customers. Most utilities have made a conscientious effort to identify and address the needs of the customer; however, a new breed of customer is emerging with greater expectations, a higher degree of sensitivity to environmental issues, a diminished sense of loyalty to business organizations and an overall suspicion of ethical and legal compliance.

Their preferred means of communication are quite different from those of the generations of loyal customers you have enjoyed in the past. They judge your performance against similar customer experiences delivered by organizations far beyond your traditional competitors.

You now compete against Wal-Mart’s supply chain process, Amazon.com’s payment processes and their favorite hotel chain’s loyalty rewards process. You are being weighed in the balances and in many cases found to be lacking. Worse yet, you may not have even recognized them as an emerging customer segment.

“The Customer Focus category examines how your organization engages its customers for long-term marketplace success and builds a customer-focused culture. Also examined is how your organization listens to the voice of its customers and uses this information to improve and identify opportunities for innovation [2].”

Measurement, Analysis, and Knowledge Management: The data created and maintained by GIS, CIS, AMI, SCADA and other systems create a wealth of information that can be analyzed to obtain knowledge sufficient to make rapid business decisions. However, many of these systems are difficult, if not impossible, to integrate with one another, leaving leaders with a lot of data but no meaningful measures of key performance. Even worse, a lack of standards related to system performance leaves many utilities that develop performance measures with a limited number of inconsistently measured comparatives from their peers.

If utilities are going to overcome the challenges of the future, it is essential that they integrate all data systems for improved accessibility and develop standards that would facilitate meaningful comparative measures. This is not to say that comparative measures do not exist; they do. However, increasing the number of utilities participating would increase our understanding of best practices and enable us to determine best-in-class performance.

“The measurement, analysis and knowledge management category examines how the organization selects, gathers, analyzes, manages and improves its data, information and knowledge assets and how it manages its information technology. The category also examines how your organization reviews and uses reviews to improve its performance [2].”

Workforce Focus: We have already addressed the aging workforce and its impact on the future of utilities. Companion challenges related to the utility workforce include the heavy benefits burdens that many utilities currently bear. Also, the industry faces a diminished interest in labor positions and the need to establish new training methods to engage a variety of generations within our workforce and ensure knowledge acquisition and retention.

The new workforce brings with it new requirements for satisfaction and engagement. The new employee has proven to be less loyal to the organization, and studies show they will have many more employers before they retire than their predecessors did. It is essential that we develop ways to identify these requirements and take action to retain these individuals, or we risk increased training costs and operational issues as they seek new employment opportunities.

“The workforce focus category examines how your organization engages, manages and develops the workforce to utilize its full potential in alignment with organizational mission, strategy and action plans. The category examines the ability to assess workforce capability and capacity needs and to build a workforce environment conducive to high performance [2].”

Process Management: It is not unusual for utilities to implement new software with dramatically increased capabilities and ask the integrator to make it align with their current processes or continue to use their current processes without regard for the system’s new capabilities. Identifying and mapping key work processes can enable incredible opportunities for streamlining your organization and facilitate increased utilization of technology.

What are your utility’s key work processes, and how do you determine them and their relationship to creating customer value? These questions are difficult for leaders to answer; yet without a clear understanding of key work processes and their alignment to core competencies, strategic advantages and challenges, your organization may be misapplying efforts related to core competencies, either outsourcing something best maintained internally or performing work that is better delivered by outsource providers.

“The process management category examines how your organization designs its work systems and how it designs, manages and improves its key processes for implementing these work systems to deliver customer value and achieve organizational success and sustainability. Also examined is your readiness for emergencies [2].”

Results: Results are the fruit of your efforts, the return the Baldrige Criteria enable you to realize from your applied efforts. All of us want positive results. Many utilities cite positive performance in measures that are easy to acquire: financial performance, safety performance and customer satisfaction. But which of these measures are key to our success and sustainability as an organization? As you answer the questions and align measures that are integral to achieving your organization’s mission and vision, it will become abundantly clear which measures you’ll need to maintain and for which you’ll need to develop competitive comparisons and benchmarks.

“The results category examines the organization’s performance and improvement in all key areas – product outcomes, customer-focused outcomes, financial and market outcomes, workforce-focused outcomes, process-effectiveness outcomes and leadership outcomes. Performance levels are examined relative to those of competitors and other organizations with similar product offerings [2].”

A Challenge

The adoption of the Baldrige criteria is often described as a journey. Few utilities have embraced this model. However, it appears to offer a comprehensive solution to the challenges we face today. Utilities have a rich history and play a positive role in our nation. A period of rapid change is upon us. We need to shift from reacting to leading as we solve the problems that face our industry. By applying this model for effective process improvement, we can once again create a world where utilities lead the future.

References

  1. Quote from U.S. Treasury Secretary Tim Geithner as communicated in SmartGrid Newsletter
  2. Malcolm Baldrige National Quality Award, “Path to Excellence and Some Path-Building Tools.” www.nist.gov/baldrige.

The Role of Telecommunications Providers in the Smart Grid

Utilities are facing a host of critical issues over the next 10 years. One of the major approaches to dealing with these challenges is for utilities to become much more "intelligent" through the development of Intelligent Utility Enterprises (IUE) and Smart Grids (SG). The IUE/SG will require ubiquitous communications systems throughout utility service territories, especially as automated metering infrastructure (AMI) becomes a reality. Wireless systems, such as the widespread cellular networks that AT&T and other public carriers already operate, will play a major role in enabling these systems.

These communications must be two-way, all the way from the utility to individual homes. The Smart Grid will be a subset of the intelligent utility, enabling utility executives to make wise decisions to deal with the pending issues. Public carriers are currently positioned to support and provide a wide range of communications technologies and services, such as WiFi, satellite and cellular, which they are continuing to develop to meet current and future utility needs.

Supply and demand reaching a critical point

Utilities face some formidable mountains in the near future, and they must climb them under the crosshairs of regulatory, legislative and public scrutiny. Among these challenges is a looming, increasing shortage of electricity, which may become more critical as global warming concerns begin to compromise the ability to build large generating plants, especially those fueled by coal.

Utilities also have to contend with the growing political strength of an environmental movement that opposes most forms of generation other than those designated as "green energy." Thus, utilities face a political/legislative/regulatory perfect storm: on the one hand, their ability to generate electricity by conventional methods is being reduced; on the other, they are being held to levels of reliability they increasingly find impossible to meet.

The Intelligent Utility Enterprise and Smart Grid, with AMI as a subset of the Smart Grid, as potential, partial solutions

The primary solution proposed to date, which utilities can embrace on their own without waiting for regulatory/legislative/political clarity, is to use technology: IUEs to become much more effective organizations, and SGs to substitute intelligence for manpower. The Smart Grid evolution also will enable the general public to take part in solving these problems through demand response. A subset of that evolution will be outage management, ensuring that outages are anticipated and, except where required by supply shortages, resolved rapidly and effectively.

The IUE/SG, for the first time, will enable utility executives to see exactly what is happening on the grid in real time, so they can make the critical, day-to-day decisions in an environment of increasingly high prices and diminished supply for electricity.

Wireless To Play A Major Role In Required Ubiquitous Communications

Automating the self-operating, self-healing grid – artificial intelligence

The IUE/SG obviously will require enterprise-wide digital communications to enable the rapid transfer of data between one system and another, all the way from smart meters and other in-home gateways to the boardrooms where critical decisions will be made. Utilities have already embraced service-oriented architecture (SOA) as a means of linking everything together. SOA-enabled systems are easily linked over IP, which can operate over the traditional wire and fiber-optic communications systems many utilities have in place, as well as over existing cellular wireless systems. Wireless communications are becoming more helpful in linking disparate systems from the home, through the distribution systems, to substations, control rooms and beyond to the enterprise. The ubiquitous utility communications of the future will integrate a wide range of systems, some of them owned by the utilities and others leased and contracted from various carriers.

The Smart Grid is a subset of the entire utility enterprise and is linked to the boardroom by various increasingly intelligent systems throughout.

Utility leadership will need vital information about the operation of the grid all the way into the home, where distributed generation, net billing, and demand-response reduction of voltage or current will take place. This communications network must operate in real time and must provide information to all of what traditionally were called "back office" systems, which now must be capable of collating information never before received or considered.

The distribution grid itself will have to become much more automated, self-healing, and self-operating through artificial intelligence. Traditional SCADA (supervisory control and data acquisition) will have to become more capable, and the data it collects will have to be pushed further up into the utility enterprise and to departments that have not previously dealt with real-time data.

The communications infrastructure

In the past, utilities typically owned much of their communications systems. Most of these systems are aged, and converting them to modern digital systems is difficult and expensive.

Utilities are likely to embrace a wide range of new and existing communications technologies as they grapple with their supply/demand disconnect problem. One of these is IP/MPLS (Internet Protocol/Multiprotocol Label Switching), which is already proven in utility communications networks as well as in other industries that require mission-critical communications. MPLS makes communications more reliable and provides the prioritization needed to ensure the required latency for specific traffic.
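As a sketch of the prioritization idea, the following strict-priority scheduler mimics how an MPLS edge device might favor critical traffic; the traffic-class names and rankings are illustrative assumptions for this sketch, not values from any MPLS standard (real deployments map classes to EXP/DSCP markings).

```python
import heapq

# Illustrative traffic classes (lower number = higher priority).
# These names and rankings are assumptions, not an MPLS standard.
PRIORITY = {
    "teleprotection": 0,   # most latency-sensitive
    "scada": 1,
    "ami_metering": 2,
    "corporate_data": 3,   # best effort
}

class PriorityScheduler:
    """Strict-priority queue: higher-priority frames always dequeue
    first, which is how an edge router can bound latency for critical
    traffic even when best-effort traffic arrived earlier."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # preserves FIFO order within one class

    def enqueue(self, traffic_class, payload):
        heapq.heappush(self._queue,
                       (PRIORITY[traffic_class], self._seq, payload))
        self._seq += 1

    def dequeue(self):
        _, _, payload = heapq.heappop(self._queue)
        return payload

sched = PriorityScheduler()
sched.enqueue("corporate_data", "email sync")
sched.enqueue("scada", "breaker status")
sched.enqueue("teleprotection", "trip signal")
print(sched.dequeue())  # "trip signal" leaves first despite arriving last
```

A production network enforces this in hardware per hop, of course; the sketch only shows why class-based queuing, rather than raw bandwidth, is what protects latency-sensitive grid traffic.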

One of the advantages offered by public carriers is that their networks have almost ubiquitous coverage of utility service territories, as well as built-in switching capabilities. They also have been built to communications standards that, while still evolving, help ensure important levels of security and interoperability.

"Cellular network providers are investing billions of dollars in their networks," points out Henry L. Jones II, chief technology officer at SmartSynch, an AMI vendor and author of the article entitled "Want six billion dollars to invest in your AMI network?"

"AT&T alone will be spending 16-17 billion dollars in 2009," Jones notes. "Those investments are spent efficiently in a highly competitive environment to deliver high-speed connectivity anywhere that people live and work. Of course, the primary intent of these funds is to support mobile users with web browsing and e-mail. Communicating with meters is a much simpler proposition, and one can rely on these consumer applications to provide real-world evidence that scalability to system-wide AMI will not be a problem."

Utilities deal in privileged communications with their customers, and their systems are vulnerable to terrorism. As a result, Congress, through the Federal Energy Regulatory Commission (FERC), designated NERC (the North American Electric Reliability Corporation) as the organization responsible for ensuring the security of all utility facilities, including communications.

As an example of meeting security needs at a major utility, AT&T is providing redundant communications systems over a wireless WAN for a utility’s 950 substations, according to Andrew Hebert, AT&T Area Vice President, Industry Solutions Mobility Practice. This enables the utility to meet critical infrastructure protection standards and "harden" its SCADA and distribution automation systems by providing redundant communications pathways.

SCADA communication, distributed automation, and even devices providing artificial intelligence reporting are possible with today’s modern communications systems. Latency is important in terms of automatic fault reporting and switching. The communications network must provide the delivery-time performance to support substation automation as identified in IEEE 1646. Some wireless systems now offer latencies in the 125 ms range, and some of the newer systems are designed for no more than 50 ms latency.
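As a sketch of how such delivery-time budgets might be checked, the following compares measured latencies against illustrative budgets. The class names and the 10 ms protection figure are assumptions in the spirit of IEEE 1646, not values quoted from the standard; the 50 ms and 125 ms figures correspond to the wireless systems cited above.

```python
# Hypothetical one-way delivery-time budgets in milliseconds.
# "protection" is an assumed figure; the others match the systems
# described in the text (125 ms wireless, newer 50 ms designs).
BUDGET_MS = {
    "protection": 10,
    "fault_reporting": 50,
    "monitoring": 125,
}

def meets_budget(traffic_type, measured_latency_ms):
    """Return True if a measured latency fits the budget for its class."""
    return measured_latency_ms <= BUDGET_MS[traffic_type]

print(meets_budget("monitoring", 125))       # True: 125 ms wireless suffices
print(meets_budget("fault_reporting", 125))  # False: needs a <=50 ms system
```

The point of the comparison is that a single "network latency" number is not enough; each traffic class must be checked against its own budget before a wireless link is assigned to carry it.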

As AMI becomes more widespread, millions of in-home and in-business devices will have to be controlled and managed from the utility side. Meter readings must be collected and routed to meter data management systems. While it is possible to feed all this data directly to some central location, it is likely that this data avalanche will be routed through substations for aggregation, handling and transfer to corporate WANs. As the number of meter points grows, and the number of readings taken per hour and the number of in-home control signals increase, bandwidth and latency factors will have to be considered carefully.
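A rough, back-of-the-envelope sizing of the backhaul from one aggregating substation illustrates why readings per hour matter; the meter count and the per-reading byte figure (payload plus protocol overhead) are illustrative assumptions, not vendor data.

```python
def substation_backhaul_kbps(meters, readings_per_hour,
                             bytes_per_reading=200):
    """Average backhaul bandwidth, in kbps, for meter data aggregated
    at one substation. bytes_per_reading is an assumed figure covering
    payload plus protocol overhead."""
    bytes_per_hour = meters * readings_per_hour * bytes_per_reading
    return bytes_per_hour * 8 / 3600 / 1000  # bits, then per second, then kbps

# 10,000 meters reporting once per hour:
print(round(substation_backhaul_kbps(10_000, 1), 1))   # ~4.4 kbps average
# The same meters reporting every 5 minutes (12 readings per hour):
print(round(substation_backhaul_kbps(10_000, 12), 1))  # ~53.3 kbps average
```

These are averages; real designs must also allow for peaks, retries, and control traffic flowing back toward the meters, so the provisioned link is typically well above the average figure.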

Public cellular carriers already have interoperability (e.g., you can call someone on a cell phone although they use a different carrier), and it is likely that there will be more standardization of communications systems going forward. A paradigm shift toward national and international communications interoperability already has occurred – for example, with the global GSM standard on which the AT&T network is based. A similar shift in the communications systems utilities use is necessary and likely to come about in the next few years. It no longer is practical for utilities to have to cobble together communications with varying standards for different portions of their service territory, or different functional purposes.

Surviving the Turmoil

With the new administration talking about a trillion dollars of infrastructure investment, the time for the intelligent utility of the future is now. Political pressure and climate change are going to drive massive investments in renewable and clean energy and smart grid technology. These investments will empower customers through the launch and adoption of demand response and energy efficiency programs.

Many believe that the utility industry will change more in the next five years than the previous 50. The greatest technological advancements are only valuable if they can enable desired business outcomes. In a world of rapidly changing technology it is easy to get caught up in the decisions of what to put in, how, when, and where – making it easy to forget why.

A New Era Emerges

The utility industry has, for decades, been the sleeping giant of the U.S. economy. Little has changed in service delivery and consumer options over the last 50 years. But a perfect storm of legislation, funding and technology has set in motion new initiatives that will change the way customers use and think about their utility service. The American Recovery and Reinvestment Act allocates more than $4 billion, via the Smart Grid Investment Grant Program, for development and upgrade of the electrical grid. Simultaneously, significant strides in smart metering technology make the prospect of a rewired grid more feasible.

While technological advances toward the intelligent utility are exciting, technology in and of itself is not the solution for the utility of the future. How those technologies are applied to supporting business outcomes will be key to success in a consumer-empowered environment. Those outcomes must include considerations such as increasing or sustaining customer service levels and reducing bad debt through innovative charging methods and better control of consumption patterns.

Facing New Challenges

Future smart grid considerations aside, consumer expectations are already undergoing transformation. Although some energy prices have decreased recently in light of declining natural gas prices, the long-term trend indicates rates will continue to climb. Faced with increasing energy costs and declining household incomes, customers are looking for options to reduce their utility bill. Further, utilities’ ability to meet demand during peak periods is often inadequate. According to the Galvin Electricity Initiative, “Each day, roughly 500,000 Americans spend at least two hours without electricity in their homes and businesses. Such outages cost at least $150 billion a year. The future looks even worse. Without substantial innovation and investment, rolling blackouts and soaring power bills will become a persistent fact of life [1].”

Simultaneously, environmental concerns are influencing a greater number of consumers than in the past. In April 2009, the U.S. Environmental Protection Agency (EPA) announced it had identified six greenhouse gases that may endanger public health or welfare [2]. According to the EPA, the process of generating electricity creates 41 percent of all carbon dioxide emissions in the U.S. Utilities are under pressure to offer ways to reduce the impact of fossil fuels to accommodate rapidly changing economic and social conditions.

Strategies such as rate structures that incent customers to schedule their energy-intensive activities during off-peak times would help the utility to avoid, or reduce, reliance on the facilities that produce greenhouse gases. Lowering a residential thermostat by just 2 degrees reduces reliance on less desirable sources of generation. According to McKinsey & Company, carbon dioxide emissions can be reduced by 34 percent in the residential sector alone through enhanced energy productivity [3].

If a significant number of residential consumers could reschedule their peak usage today, it would extend the life of the current infrastructure and reduce the need to raise rates in order to fund capital investments. But at present, in most jurisdictions there is no demonstrable incentive, such as rate structures that reward off-peak usage, to motivate consumers to conserve in any meaningful way.

Aging CIS

Those utilities saddled with aging customer information systems (CIS) – and those executives who have been reluctant to adopt new technology – will be challenged to adapt to the new paradigm. Even utilities with a relatively new CIS in place may find themselves with technology not suited to today’s world. Typically, utilities have been “load serving entities” – matching supply to demand. In the new recession-prone environment, proactive utilities will need to encourage conservation to match supply. Most utilities do not have the capability to show consumers how and when they can save money by using electricity during off-peak hours.

Until utilities can address these needs, and answer customer inquiries about how to save money and energy, they will not be in a position to focus on desired business outcomes. Currently, many utilities track quantitative performance indicators, not business outcomes.

Desired Business Outcomes

Determining the tools, processes or intellectual property needed to achieve desired business outcomes can be a dilemma. Realizing targeted results may require out-of-the-box thinking. To leverage best-in-class practices, many utilities seek external expertise ranging from advisory and consulting resources to a fully outsourced solution.

When addressing the changes the future utility faces, it is easy to become focused on the what, how, when and where to deploy emerging technology rather than the most important element – why deploy at all? Figure 1 depicts Vertex’s four-level solutions approach to business outcomes as an example of keeping the focus on the “why.”

Level 1: Identify Business Challenges. What are the key issues your organization is grappling with? They may be part of the macro trends impacting the industry as a whole or they may be specific to your company. The list might include issues such as substantial bad debt, poor customer satisfaction, declining revenue and profits, high operating cost to serve, and customer acquisition and retention.

Level 2: Identify Desired Outcomes. While acting on business challenges is an integral part of the process, the desired business outcomes are the drivers that will guide you to the solution. At the same time, the solution will also determine if the desired outcomes can be achieved with in-house resources or if an experienced third party should join the team. The solution will also clarify whether you have the technology to realize the desired outcomes or if an investment will be necessary. For example, desired outcomes might include reducing bad debt by 10 percent, improving customer satisfaction from the second quartile to the first quartile, or eliminating 30 percent of the cost of the meter-to-cash process. One or more of these outcomes may require new supporting technology.

Level 3: Develop and Implement Solution. Once the specific business challenges have been fully discussed and the desired outcomes outlined, the next step requires designing the solution to enable achievement. The solution needs to be realistic, in line with your corporate culture, and deliver the right mix of technology, innovation and practicality, all with the appropriate cost-to-value ratio. Management must avoid the lure of overengineering to meet the goal, and thereby incurring more expense and complexity than needed. And the journey from perceived solution to actual solution to achieve a desired outcome might include some surprising elements.

For example, accomplishing the goal of reducing customer service costs by 30 percent might call for enhanced customer service representative (CSR) education and a reduction in the average number of calls a customer makes to the call center each year. The eventual solution may be very complex, and require touching all areas of the meter-to-cash process, along with implementing next generation technology. Or the solution may be as simple as upgrading the customer’s bill to provide more accurate and timely information. Putting more information in the customer’s hands makes billing easier to understand, resulting in fewer customer calls per year, leading to lower customer service costs. The value proposition enabling the business outcome might rely on a more robust analytics engine for analyzing and presenting data to customers. There are generally multiple paths that can bring about achieving a desired business outcome. Seeking external help on the pros and cons of the paths might be valuable to utility executives, especially if the path involves deploying new technology.

Level 4: Measure Solution Results. Continuous process improvement must be a component of all solutions. The results must be measured and compared against the desired business outcomes. Reviewing results and lessons learned in a closed loop will empower continuous process improvement and maintain focus on the process.

Conservation and Education

While current technology may not be up to the task of helping consumers conserve and save money on energy, those restrictions will change in the very near future. Utilities need to start viewing themselves less as responders to supply and demand and more as advocates for conservation, the environment, and de-coupling of rates. Massive investments in clean and renewable energy, and smart grid technology, will empower customers to employ demand response decisions and gain energy efficiency. The real issue for the utility will not be how to implement the technology itself – wired, wireless, satellite, etc. – but how best to use the technology to achieve its desired business outcomes. Further, utilities need to be prepared for some disruption to business as usual while technology and business processes undergo a sea change.

The capability of deploying a smart grid and advanced meter management (AMM) is one of the most significant changes impacting utilities today. The outcomes are not achieved by technology alone. Those outcomes require the merging of AMM with meter-to-cash processes. The utility will realize business value only if the people and discrete processes within the customer care component of the end-to-end process evolve to take advantage of new technology.

The New Reality

Most utilities already enjoy acceptable levels of customer satisfaction. As the smart grid comes on line, with its associated learning curve, myriad details and inevitable glitches, customers will depend on the utility for support and clarification. Call center volumes and average handle times will increase as the complexity of the product grows by an order of magnitude. The old standard of measuring productivity according to number of calls completed within a pre-determined number of minutes will no longer be viable. Average call length increased by a factor of four for one utility that has experimented with smart grid technology. Longer call times, however, can ultimately translate to increased customer satisfaction as consumers receive the information they need to understand the new system and how to reduce their energy bill.

But a four-fold increase in call center staff to accommodate longer calls is not economically practical. In the future, utilities will need to provide more in-depth education to CSRs so they can, in turn, educate customers. They may even need to change their hiring criteria, and seek more highly skilled call center staff who are already versed in the meter-to-cash process. For some customers, alternative sources of information such as the Internet will suffice, thus offsetting some of the strain placed on the call center.

Achieving Desired Outcomes

The following section provides examples of how the combination of advanced meter management and redefined meter-to-cash processes and tools can enable and help achieve desired business outcomes.

Accurate and Timely Data – With smart meters and the smart grid able to capture usage data in intervals as frequent as five minutes, utilities will have more current information about system activity than ever before. Developing a strategy for managing this massive database will require forethought to avoid overwhelming the back office. Once smart meters are fully deployed throughout a service area, customers will no longer receive estimated bills. Devices in the home will provide readouts about usage activity, and some consumer education may be needed to help households understand the presented data and how it translates to their usage patterns and billing. Demand response participation is likely to increase as consumers become more aware of the benefits of managing their energy usage patterns. The federal government’s stimulus bill funding may include allocations for retrofits for low-income homeowners. The call center can function as a resource for customers who wish to investigate this program.
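To get a feel for the data volumes involved, a quick calculation shows how five-minute interval data accumulates; the one-million-meter service area is an illustrative assumption.

```python
def annual_interval_records(meters, interval_minutes=5):
    """Number of meter-reading records generated per year.
    A 5-minute interval yields 288 readings per meter per day."""
    intervals_per_day = 24 * 60 // interval_minutes  # 288 for 5-minute data
    return meters * intervals_per_day * 365

# An assumed service area of one million meters at 5-minute intervals:
records = annual_interval_records(1_000_000)
print(f"{records:,}")  # 105,120,000,000 records per year
```

Over one hundred billion records per year from a single million-meter territory is why the meter data management strategy has to be settled before deployment, not after the back office is already drowning.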

Reduced Bad Debt – As noted earlier, average handle time will be a less significant metric as consumer interaction with the call center increases. The CSR will become a key element in the strategy to reduce bad debt. CSRs will be the conduit for consumer education and building rapport with the customer when resolving past-due bills. As an alternative, utilities may want to turn to Madison Avenue to help them design and roll out a customer information campaign.

Better Revenue Management – If customer education about the smart grid pays off, and consumers are using energy more judiciously, utilities will benefit. Without the pressure to make capital investments for new plants, there will be more opportunities for profit-taking and shareholder rewards. Utilities may instead be able to make profits on their energy efficiency and investments. New technologies will help utilities avoid spending the hundreds of billions of dollars that would otherwise be needed for base load. In addition, demand response participation on the part of residential consumers will better align commercial and industrial (C&I) energy pricing with residential pricing. C&I customers will see the quality and consistency of their power supply improve.

Increased Energy Efficiency – Utilities, whether municipal, public or private, will feel the social pressure to apply technologies in order to gain energy efficiency and encourage conservation. The future utility will become a leader, instead of a follower, in the campaign to improve the environment and use energy resources wisely. By using energy more strategically – that is, understanding the benefits of off-peak usage – consumers will help their utility reduce carbon emissions, which is the ultimate desired business outcome for all involved.

Increased Stakeholder Satisfaction – Stakeholders run the gamut from shareholders and public utility commissions to consumers, utility employees and executives. All of these groups will be pleased if the public uses energy more efficiently, leading to more revenue for the utility and lower costs to consumers. Showing focus on business outcomes is generally a huge plus that helps increase stakeholder satisfaction.

Lower Cost to Serve – Utilities must try to design a business model with flatter delivery costs. For example, if it costs the utility $30 to $40 per customer per year, staying within that existing range with more and longer customer calls will be a challenge. Some utilities may opt out of providing customer service with in-house staff and contract with a service provider. Recognizing that supplying and managing energy, not delivering customer care, is their core competency, a utility can often reduce the cost of customer care by partnering with an organization that is an expert in this business process. If this is the path a utility takes it is very important to find the provider that will enable the desired outcomes of your business; not all service providers are equal or focus on outcomes. We expect relationships with vendors within the industry will change, with utilities embracing more business partners than in the past.
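A toy cost-to-serve model makes the pressure on that $30 to $40 range concrete. All call volumes and dollar figures below are illustrative assumptions, not data from any utility.

```python
def cost_per_customer(calls_per_year, avg_handle_min,
                      cost_per_agent_min, other_cost_per_customer):
    """Annual customer-care cost per customer: call-handling labor
    plus all other per-customer costs. Every input is an assumption
    chosen for illustration."""
    return (calls_per_year * avg_handle_min * cost_per_agent_min
            + other_cost_per_customer)

# Baseline: 1 call/year, 5-minute handle time, $1 per agent-minute,
# $25 of other per-customer costs -> lands inside the $30-$40 range.
baseline = cost_per_customer(1.0, 5, 1.00, 25.00)   # $30.00
# Smart grid rollout: same call volume but 4x handle time (20 minutes),
# matching the fourfold call-length increase cited earlier.
rollout = cost_per_customer(1.0, 20, 1.00, 25.00)   # $45.00
print(baseline, rollout)
```

Even in this simplified model, a fourfold handle time pushes the cost above the existing range, which is exactly why the text argues for CSR education, self-service channels, or outsourcing to keep delivery costs flat.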

Increased Service Levels – Public utility commissions (PUC) often review financial and service metrics when considering a rate case. Utilities may need to collaborate with PUCs to help them understand the dynamics of smart meters, along with temporary changes in customer satisfaction and service levels, when submitting innovative rate cases and programs. Once the initial disruptive period of new technology is completed, utilities will be able to increase service levels with greater responsiveness to customer needs. When the call center staff is fully educated about smart meters and demand response, they will be positioned to provide customers with more comprehensive service, thus reducing the number of incoming and outgoing calls.

Future Competition – The current and upcoming changes in the industry are so dramatic that utilities must first assess how consumers are accepting change. Reinventing the grid via the smart grid and its related products and services will create new opportunities and new business models with potential for increased revenue. The extent to which the future market is more competitive depends on the rate of acceptance by consumers and how skillfully utilities adopt new business models. It is our premise that utilities who desire the right business outcomes and focus on enabling them through process, people, and technological changes will be most able to excel in a more competitive environment.

References

  1. Galvin Electricity Initiative, sponsored by The Galvin Project, Inc., www.galvinpower.org
  2. Press Release, “EPA Finds Greenhouse Gases Pose Threat to Public Health, Welfare/Proposed Finding Comes in Response to 2007 Supreme Court Ruling,” April 17, 2009. http://yosemite.epa.gov
  3. McKinsey Global Institute, “Wasted Energy: How the US Can Reach its Energy Productivity Potential,” McKinsey & Company, June 2007.