The Role of Telecommunications Providers in the Smart Grid

Utilities face a host of critical issues over the next 10 years. One of the major approaches to dealing with these challenges is for utilities to become much more "intelligent" through the development of Intelligent Utility Enterprises (IUE) and Smart Grids (SG). The IUE/SG will require ubiquitous communications systems throughout utility service territories, especially as advanced metering infrastructure (AMI) becomes a reality. Wireless systems, such as the widespread cellular networks that AT&T and other public carriers already operate, will play a major role in enabling these systems.

These communications must be two-way, all the way from the utility to individual homes. The Smart Grid will be a subset of the intelligent utility, enabling utility executives to make wise decisions about the pending issues. Public carriers are currently positioned to support and provide a wide range of communications technologies and services, such as WiFi, satellite and cellular, which they continue to develop to meet current and future utility needs.

Supply and demand reaching critical concern

Utilities face some formidable mountains in the near future, and they must climb them under intense regulatory, legislative and public scrutiny. Chief among these challenges is a looming and growing shortage of electricity, which may become more critical as global warming concerns compromise the ability to build large generating plants, especially those fueled by coal.

Utilities also have to contend with the growing political strength of an environmental movement that opposes most forms of generation other than those designated as "green energy." Thus, utilities face a political/legislative/regulatory perfect storm: on the one hand, their ability to generate electricity by conventional methods is being reduced; on the other, they are being held to levels of reliability they increasingly find impossible to meet.

The Intelligent Utility Enterprise and Smart Grid, with AMI as a subset of the Smart Grid, as potential partial solutions

The primary solution proposed to date, which utilities can embrace on their own without waiting for regulatory/legislative/political clarity, is to become much more effective organizations through IUEs and to substitute intelligence for manpower through SGs. The Smart Grid evolution also will enable the general public to take part in solving these problems through demand response. A subset of that evolution will be outage management, ensuring that outages are anticipated and, except where required by supply shortages, resolved rapidly and effectively.

The IUE/SG, for the first time, will enable utility executives to see exactly what is happening on the grid in real time, so they can make the critical, day-to-day decisions in an environment of increasingly high prices and diminished supply for electricity.

Wireless To Play A Major Role In Required Ubiquitous Communications

Automating the self-operating, self-healing grid – artificial intelligence

The IUE/SG obviously will require enterprise-wide digital communications to enable the rapid transfer of data between one system and another, all the way from smart meters and other in-home gateways to the boardrooms where critical decisions will be made. Utilities have already embraced service-oriented architecture (SOA) as a means of linking everything together. SOA-enabled systems are easily linked over IP, which is capable of operating over traditional wire and fiber optic communications systems, which many utilities have in place, as well as existing cellular wireless systems. Wireless communications are becoming more helpful in linking disparate systems from the home, through the distribution systems, to substations, control rooms and beyond to the enterprise. The ubiquitous utility communications of the future will integrate a wide range of systems, some of them owned by the utilities and others leased and contracted from various carriers.

The Smart Grid is a subset of the entire utility enterprise and is linked to the boardroom by various increasingly intelligent systems throughout.

Utility leadership will need vital information about the operation of the grid all the way into the home, where distributed generation, net billing, demand response, and reduction of voltage or current will take place. This communications network must operate in real time and must provide information to all of what traditionally were called "back office" systems, which now must be capable of collating information never before received or considered.

The distribution grid itself will have to become much more automated, self-healing, and self-operating through artificial intelligence. Traditional SCADA (supervisory control and data acquisition) will have to become more capable, and the data it collects will have to be pushed further up into the utility enterprise and to departments that have not previously dealt with real-time data.

The communications infrastructure

In the past, utilities typically owned much of their communications systems. Most of these systems are aged, and converting them to modern digital systems is difficult and expensive.

Utilities are likely to embrace a wide range of new and existing communications technologies as they grapple with their supply/demand disconnect problem. One of these is IP/MPLS (Internet Protocol/Multi Protocol Label Switching), which already is proven in utility communications networks as well as other industries which require mission critical communications. MPLS is used to make communications more reliable and provide the prioritization to ensure the required latency for specific traffic.
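The prioritization the paragraph above attributes to MPLS boils down to class-based scheduling: critical traffic is dispatched ahead of bulk traffic regardless of arrival order. The following is a minimal sketch of that idea only; the traffic classes and priority values are hypothetical, not drawn from any vendor's QoS scheme.

```python
import heapq

# Hypothetical traffic classes, highest priority first (lower number wins).
PRIORITY = {"protection": 0, "scada": 1, "ami": 2, "corporate": 3}

class PriorityScheduler:
    """Strict-priority dispatch: the mechanism MPLS traffic engineering
    relies on to bound latency for mission-critical classes."""

    def __init__(self):
        self._q = []
        self._seq = 0  # tie-breaker keeps FIFO order within a class

    def enqueue(self, traffic_class, payload):
        heapq.heappush(self._q, (PRIORITY[traffic_class], self._seq, payload))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._q)[2]

sched = PriorityScheduler()
sched.enqueue("ami", "meter reading batch")
sched.enqueue("scada", "breaker status poll")
sched.enqueue("protection", "trip signal")
print(sched.dequeue())  # the trip signal leaves first despite arriving last
```

The point of the sketch is the ordering guarantee, not the queue itself: however congested the lower classes become, protection and SCADA traffic cannot be starved behind bulk AMI transfers.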

One of the advantages offered by public carriers is that their networks have almost ubiquitous coverage of utility service territories, as well as built-in switching capabilities. They also have been built to communications standards that, while still evolving, help ensure important levels of security and interoperability.

"Cellular network providers are investing billions of dollars in their networks," points out Henry L. Jones II, chief technology officer at AMI vendor SmartSynch and author of the article "Want six billion dollars to invest in your AMI network?"

"AT&T alone will be spending 16-17 billion dollars in 2009," Jones notes. "Those investments are spent efficiently in a highly competitive environment to deliver high-speed connectivity anywhere that people live and work. Of course, the primary intent of these funds is to support mobile users with web browsing and e-mail. Communicating with meters is a much simpler proposition, and one can rely on these consumer applications to provide real-world evidence that scalability to system-wide AMI will not be a problem."

Utilities deal in privileged communications with their customers, and their systems are vulnerable to terrorism. As a result, Congress, through the Federal Energy Regulatory Commission (FERC), designated the North American Electric Reliability Corporation (NERC) as the organization responsible for ensuring the security of all utility facilities, including communications.

As an example of meeting security needs at a major utility, AT&T is providing redundant communications systems over a wireless WAN for a utility’s 950 substations, according to Andrew Hebert, AT&T Area Vice President, Industry Solutions Mobility Practice. This enables the utility to meet critical infrastructure protection standards and "harden" its SCADA and distribution automation systems by providing redundant communications pathways.

SCADA communication, distribution automation, and even devices providing artificial intelligence reporting are possible with today's modern communications systems. Latency is important for automatic fault reporting and switching. The communications network must provide the delivery-time performance needed to support substation automation as identified in IEEE 1646. Some wireless systems now offer latencies in the 125 ms range; some newer systems are designed for no more than 50 ms.
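A planner might express the latency figures above as a simple budget check against measured one-way delays. The 50 ms and 125 ms numbers come from the discussion in this paper; the class names and the mapping of classes to thresholds are illustrative assumptions, not a restatement of the IEEE 1646 tables.

```python
# Illustrative delivery-time targets in milliseconds (see discussion above;
# class boundaries are assumptions, not quoted from IEEE 1646).
TARGETS_MS = {
    "protection_trip": 50,    # newer systems designed for <= 50 ms
    "fault_reporting": 125,   # achievable on some current wireless systems
}

def meets_target(traffic_type: str, measured_ms: float) -> bool:
    """Return True if a measured one-way delay fits its class budget."""
    return measured_ms <= TARGETS_MS[traffic_type]

print(meets_target("fault_reporting", 110.0))   # True: within 125 ms
print(meets_target("protection_trip", 80.0))    # False: exceeds 50 ms
```

In practice such a check would run against percentile delay measurements rather than single samples, since substation automation cares about worst-case, not average, delivery time.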

As AMI becomes more widespread, millions of in-home and in-business devices will have to be controlled and managed from the utility side. Meter readings must be collected and routed to meter data management systems. While it is possible to feed all this data directly to some central location, it is likely that this data avalanche will be routed through substations for aggregation, handling, and transfer to corporate WANs. As the number of meter points grows, and the number of readings taken per hour and the number of in-home control signals increase, bandwidth and latency factors will have to be considered carefully.
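The scale of that data avalanche is easy to bound with back-of-envelope arithmetic. The sketch below is illustrative only; the meter count, reading rate, and per-reading payload size are assumptions chosen for the example, not figures from any deployment.

```python
def ami_daily_bytes(meters: int, readings_per_hour: int,
                    bytes_per_reading: int) -> int:
    """Back-of-envelope daily AMI data volume (all parameters assumed)."""
    return meters * readings_per_hour * 24 * bytes_per_reading

# Example: 1,000,000 meters, 4 readings per hour, 200 bytes per reading
# (payload plus protocol headers -- an assumption, not a measured figure).
total = ami_daily_bytes(1_000_000, 4, 200)
print(f"{total / 1e9:.1f} GB/day")  # 19.2 GB/day
```

Even these modest assumptions yield tens of gigabytes per day, which is why aggregation at substations, rather than a single central collection point, is the likely architecture.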

Public cellular carriers already have interoperability (e.g., you can call someone on a cell phone although they use a different carrier), and it is likely that there will be more standardization of communications systems going forward. A paradigm shift toward national and international communications interoperability already has occurred – for example, with the global GSM standard on which the AT&T network is based. A similar shift in the communications systems utilities use is necessary and likely to come about in the next few years. It no longer is practical for utilities to have to cobble together communications with varying standards for different portions of their service territory, or different functional purposes.

Managing Communications Change

Change is being forced upon the utilities industry. Business drivers range from stakeholder pressure for greater efficiency to the changing technologies involved in operational energy networks. New technologies, such as intelligent networks or smart grids, distribution automation, and smart metering, are being considered.

The communications network is becoming the key enabler for the evolution of reliable energy supply. However, few utilities today have a communications network that is robust enough to handle and support the exacting demands that energy delivery is now making.

It is this process of change – including the renewal of the communications network – that is vital to each utility's future. But for the utility, this is a technological step change requiring different strategies and designs. It also requires new skills, all of which must be implemented in timescales that do not sit comfortably with traditional technology strategies.

The problems facing today’s utility include understanding the new technologies and assessing their capabilities and applications. In addition, the utility has to develop an appropriate strategy to migrate legacy technologies and integrate them with the new infrastructure in a seamless, efficient, safe and reliable manner.

This paper highlights the benefits utilities can realize by adopting a new approach to their customers’ needs and engaging a network partner that will take responsibility for the network upgrade, its renewal and evolution, and the service transition.

The Move to Smart Grids

The intent of smart grids is to provide better efficiency in the production, transport and delivery of energy. This is realized in two ways:

  • Better real-time control: ability to remotely monitor and measure energy flows more closely, and then manage those flows and the assets carrying them in real time.
  • Better predictive management: ability to monitor the condition of the different elements of the network, predict failure and direct maintenance. The focus is on being proactive to real needs prior to a potential incident, rather than being reactive to incidents, or performing maintenance on a repetitive basis whether it is needed or not.
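The predictive-management bullet above amounts to condition-based triggering: act when a monitored indicator trends past a limit, rather than after a failure or on a fixed calendar. The sketch below is a minimal illustration of that logic; the threshold, window length, and temperature readings are made-up example values.

```python
def needs_inspection(temps_c, limit_c=75.0, rising_for=3):
    """Flag an asset when its last `rising_for` readings all exceed
    the limit -- a toy condition-monitoring rule, not a field algorithm."""
    recent = temps_c[-rising_for:]
    return len(recent) == rising_for and all(t > limit_c for t in recent)

# Hypothetical hourly hot-spot temperatures from a transformer sensor:
transformer_temps = [68.0, 71.5, 76.2, 77.0, 78.4]
print(needs_inspection(transformer_temps))  # True: three readings above 75 C
```

Real condition monitoring uses trend models and multiple correlated indicators, but the architectural consequence is the same as the toy rule's: every monitored asset becomes a measurement point that must report continuously over the communications network.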

These mechanisms imply more measurement points, remote monitoring and management capabilities than exist today. And this requires a greater reliance on reliable, robust, highly available communications than has ever been the case before.

The communications network must continue to support operational services independently of external events, such as power outages or public service provider failure, yet be economical and simple to maintain. Unfortunately, the majority of today’s utility communications implementations fall far short of these stringent requirements.

Changing Environment

The design template for the majority of today’s energy infrastructure was developed in the 1950s and 1960s – and the same is true of the associated communications networks.

Typically, these communications networks have evolved into a series of overlays, often of different technology types and generations (see Figure 1). For example, protection tends to use its own dedicated network. The physical realization varies widely, from tones over copper via dedicated time division multiplexing (TDM) connections to dedicated fiber connections. These generally use a mix of privately owned and leased services.

Supervisory control and data acquisition (SCADA) systems generally still use modem technology at speeds from 300 baud to 9.6 kbaud. Again, the infrastructure is often copper or TDM running as one of many separate overlay networks.
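Those legacy line rates put a hard floor under polling times, which is one reason these overlays are being reassessed. The arithmetic below assumes a conventional asynchronous framing of roughly 10 bits per byte (8 data bits plus start and stop bits); the 250-byte message size is an illustrative assumption, not a protocol figure.

```python
def transfer_seconds(message_bytes: int, baud: int) -> float:
    """Time to move a message over an async serial link,
    assuming ~10 line bits per byte (8 data + start + stop)."""
    return message_bytes * 10 / baud

# Compare the two ends of the legacy SCADA speed range for one
# hypothetical 250-byte poll response:
for rate in (300, 9600):
    print(f"{rate:>5} baud: {transfer_seconds(250, rate):.2f} s per 250-byte poll")
```

At 300 baud a single such poll takes over eight seconds, against roughly a quarter of a second at 9.6 kbaud; neither approaches the sub-second delivery times that substation automation demands, which motivates the migration toward IP-based e-SCADA discussed below.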

Lastly, operational voice services (as opposed to business voice services) are frequently analog on yet another separate network.

Historically, there were good operational reasons for these overlays. But changes in device technology (for example, the evolution toward e-SCADA based on IP protocols), as well as decreasing vendor support for legacy communications technologies, mean that the strategy for these networks has to be reassessed. In addition, the increasing demand for further operational applications (for example, condition monitoring or CCTV, both supporting substation automation) requires a more up-to-date networking approach.

Tomorrow’s Network

With the exception of protection services, communications between network devices and the network control centers are evolving toward IP-based networks (see Figure 2). The benefits of this simplified infrastructure are significant and can be measured in terms of asset utilization, reduced capital and operational costs, ease of operation, and the flexibility to adapt to new applications. Consequently, utilities will find themselves forced to seriously consider the shift to a modern, homogeneous communications infrastructure to support their critical operational services.

Organizing For Change

As noted above, there are many cogent reasons to transform utility communications to a modern, robust communications infrastructure in support of operational safety, reliability and efficiency. However, some significant considerations should be addressed to achieve this transformation:

Network Strategy. It is almost inevitable that a new infrastructure will cross traditional operational and departmental boundaries within the utility. Each operational department will have its own priorities and requirements for such a network, and traditionally, each wants some, or total, control. However, to achieve real benefits, a greater degree of centralized strategy and management is required.

Architecture and Design. The new network will require careful engineering to ensure that it meets the performance-critical requirements of energy operations. It must maintain or enhance the safety and reliability of the energy network, as well as support the traffic requirements of other departments.

Planning, Execution and Migration. Planning and implementation of the core infrastructure is just the start of the process. Each service requires its own migration plan and has its own migration priorities. Each element requires specialist technical knowledge, and for preference, practical field experience.

Operation. Gone are the days when a communications failure was rectified by sending an engineer into the field to find the fault and fix it. Maintaining network availability and robustness calls for sound operational processes and excellent diagnostics before any engineer or technician hits the road. The same level of robust centralized management tools and processes that support the energy networks has to be put in place to support the communications network – no matter what technologies are used in the field.

Support. Although these technologies are well understood by the telecommunications industry, they are likely to be new to the energy utilities industry. This means that a solid support organization familiar with these technologies must be established. The evolution process requires an intense level of up-front skills and resources. Often these are not readily available in-house – certainly not in the volume required to make any network renewal or transformation effective. Building up this skill and resource base through recruitment will not necessarily yield staff who are familiar with the peculiarities of the energy utilities market. As a result, there will be a significant time lag from concept to execution, and considerable risk for the utility as it ventures alone into unknown territory.

Keys To Successful Engagement

Engaging a services partner does not mean ceding control through a rigid contract. Rather, it means crafting a flexible relationship that takes into consideration three factors: What is the desired outcome of the activity? What is the best balance of scope between partner assistance and in-house performance to achieve that outcome? How do you retain the flexibility to accommodate change while retaining control?

Desired outcome is probably the most critical element and must be well understood at the outset. For one utility, the desired outcome may be to rapidly enable the upgrade of the complete energy infrastructure without having to incur the upfront investment in a mass recruitment of the required new communications skills.

For other utilities, the desired outcome may be different. But if the outcomes include elements of time pressure, new skills and resources, and/or network transformation, then engaging a services partner should be seriously considered as one of the strategic options.

Second, not all activities have to be in scope. The objective of the exercise might be to supplement existing in-house capabilities with external expertise. Or, it might be to launch the activity while building up appropriate in-house resources in a measured fashion through the Build-Operate-Transfer (BOT) approach.

In looking for a suitable partner, the utility seeks to leverage not only the partner’s existing skills, but also its experience and lessons learned performing the same services for other utilities. Having a few bruises is not a bad thing – this means that the partner understands what is at stake and the range of potential pitfalls it may encounter.

Lastly, retaining flexibility and control is a function of the contract between the two parties which should be addressed in their earliest discussions. The idea is to put in place the necessary management framework and a robust change control mechanism based on a discussion between equals from both organizations. The utility will then find that it not only retains full control of the project without having to take day-to-day responsibility for its management, but also that it can respond to change drivers from a variety of sources – such as technology advances, business drivers, regulators and stakeholders.

Realizing the Benefits

Outsourcing or partnering the communications transformation will yield benefits, both tangible and intangible. It must be remembered that there is no standard “one-size-fits-all” outsourcing product. Thus, the benefits accrued will depend on the details of the engagement.

There are distinct tangible benefits that can be realized, including:

Skills and Resources. A unique benefit of outsourcing is that it eliminates the need to recruit skills not available internally. These are provided by the partner on an as-needed basis. The additional advantage for the utility is that it does not have to bear the fixed costs once they are no longer required.

Offset Risks. Because the partner is responsible for delivery, the utility is able to mitigate risk. For example, traditionally vendors are not motivated to do anything other than deliver boxes on time. But with a well-structured partnership, there is an incentive to ensure that the strategy and design are optimized to economically deliver the required services and ease of operation. Through an appropriate regime of business-related key performance indicators (KPIs), there is a strong financial incentive for the partner to operate and upgrade the network to maintain peak performance – something that does not exist when an in-house organization is used.

Economies of Scale. Outsourcing can bring the economies of scale resulting from synergies together with other parts of the partner’s business, such as contracts and internal projects.

There also are many other benefits associated with outsourcing that are not as immediately obvious and commercially quantifiable as those listed above, but can be equally valuable.

Some of these less tangible benefits include:

Fresh Point of View. Within most companies, employees often have a vested interest in maintaining the status quo. But a managed services organization has a vested interest in delivering the best possible service to the customer – a paradigm shift in attitude that enables dramatic improvements in performance and creativity.

Drive to Achieve Optimum Efficiency. Executives, freed from the day-to-day business of running the network, can focus on their core activities, concentrating on service excellence rather than complex technology decisions. To quote one customer, “From my perspective, a large amount of my time that might have in the past been dedicated to networking issues is now focused on more strategic initiatives concerned with running my business more effectively.”

Processes and Technologies Optimization. Optimizing processes and technologies to improve contract performance is part of the managed services package and can yield substantial savings.

Synergies with Existing Activities Create Economies of Scale. A utility and a managed services vendor have considerable overlap in the functions performed within their communications engineering, operations and maintenance activities. For example, a multi-skilled field force can install and maintain communications equipment belonging to a variety of customers. This not only provides cost savings from synergies with the equivalent customer activity, but also an improved fault response due to the higher density of deployed staff.

Access to Global Best Practices. An outsourcing contract relieves a utility of the time-consuming and difficult responsibility of keeping up to speed with the latest thinking and developments in technology. Alcatel-Lucent, for example, invests around 14 percent of its annual revenue into research and development; its customers don’t have to.

What Can Be Outsourced?

There is no one outsourcing solution that fits all utilities. The final scope of any project will be entirely dependent on a utility’s specific vision and current circumstances.

The following list briefly describes some of the functions and activities that are good possibilities for outsourcing:

Communications Strategy Consulting. Before making technology choices, the energy utility needs to define the operational strategy of the communications network. Too often communications is viewed as “plug and play,” which is hardly ever the case. A well-thought-out communications strategy will deliver this kind of seamless operation. But without that initial strategy, the utility risks repeating past mistakes and acquiring an ad-hoc network that will rapidly become a legacy infrastructure, which will, in turn, need replacing.

Design. Outsourcing allows a utility to evolve its communications infrastructure without upfront investment in incremental resources and skills. It can delegate responsibility for defining the network architecture and the associated network support systems. A utility may elect to leave all technological decisions to the vendor and merely review progress and outcomes. Or, it may retain responsibility for technology strategy and turn to the managed services vendor to turn that strategy into an architecture and manage the subsequent design and project activities.

Build. Detailed planning of the network, the rollout project and the delivery of turnkey implementations all fall within the scope of the outsourcing process.

Operate, Administer and Maintain. Includes network operations and field and support services:

  • Network Operations. A vendor such as Alcatel-Lucent has the necessary experience in operating Network Operations Centers (NOCs), on both a BOT and an ongoing basis. This includes handling all associated tasks such as performance and fault monitoring, and services management.
  • Network and Customer Field Services. Today, few energy utilities consider maintenance and provisioning activities to be a strategic part of their business, and many recognize these are prime candidates for outsourcing. Activities that can be outsourced include corrective and preventive maintenance, network and service provisioning, and spare parts management, return and repair – in other words, all the daily, time-consuming, but vitally important elements of running a reliable network.
  • Network Support Services. Behind the first-line activities of the NOC are a set of engineering support functions that assist with more complex faults – functions that cannot be automated and tend to duplicate those of the vendor. The integration and sharing of these functions enabled by outsourcing can significantly improve the utility's efficiency.

Conclusion

Outsourcing can deliver significant benefits to a utility, both in terms of its ability to invest in and improve its operation and associated costs. However, each utility has its own unique circumstances, specific immediate needs, and vision of where it is going. Therefore, each technical and operational solution is different.

Alcatel-Lucent: Your Smart Grid Partner

Alcatel-Lucent offers comprehensive capabilities that combine Utility industry-specific knowledge and experience with carrier-grade communications technology and expertise. Our IP/MPLS transformation capabilities and Utility market-specific knowledge are the foundation of turnkey solutions designed to enable Smart Grid and Smart Metering initiatives. In addition, Alcatel-Lucent has specifically developed Smart Grid and Smart Metering applications and solutions that:

  • Improve the availability, reliability and resiliency of critical voice and data communications even during outages
  • Enable optimal use of network and grid devices by setting priorities for communications traffic according to business requirements
  • Meet NERC CIP compliance and cybersecurity requirements
  • Improve the physical security and access control mechanism for substations, generation facilities and other critical sites
  • Offer a flexible and scalable network to grow with the demands and bandwidth requirements of new network service applications
  • Provide secure web access for customers to view account, electricity usage and billing information
  • Improve customer service and experience by integrating billing and account information with IP-based, multi-channel client service platforms
  • Reduce carbon emissions and increase efficiency by lowering communications infrastructure power consumption by as much as 58 percent

Working with Alcatel-Lucent enables Energy and Utility companies to realize the increased reliability and greater efficiency of next-generation communications technology, providing a platform for, and minimizing the risks associated with, moving to Smart Grid solutions. And Alcatel-Lucent helps Energy and Utility companies achieve compliance with regulatory requirements and reductions in operational expenses while maintaining the security, integrity and high availability of their power infrastructure and services. We build Smart Networks to support the Smart Grid.

American Recovery and Reinvestment Act of 2009 Support from Alcatel-Lucent

The American Recovery and Reinvestment Act (ARRA) of 2009 was adopted by Congress in February 2009 and allocates $4.5 billion to the Department of Energy (DoE) for Smart Grid deployment initiatives. As a result of the ARRA, the DoE has established a process for awarding the $4.5 billion via investment grants for Smart Grid Research and Development, and Deployment projects. Alcatel-Lucent is uniquely qualified to help utilities take advantage of the ARRA Smart Grid funding. In addition to world-class technology and Smart Grid and Smart Metering solutions, Alcatel-Lucent offers turnkey assistance in the preparation of grant applications, and subsequent follow-up and advocacy with federal agencies. Partnership with Alcatel-Lucent on ARRA includes:

  • Design, implementation, and support for a Smart Grid network
  • Identification of all standardized and unique elements of each grant program
  • Preparation and Compilation of all required grant application components, such as project narratives, budget formation, market surveys, mapping, and all other documentation required for completion
  • Advocacy at federal, state, and local government levels to firmly establish the value proposition of a proposal and advance it through the entire process to ensure the maximum opportunity for success

Alcatel-Lucent is a Recognized Leader in the Energy and Utilities Market

Alcatel-Lucent is an active and involved leader in the Energy and Utility market, with active membership and leadership roles in key Utility industry associations, including the Utility Telecom Council (UTC), the American Public Power Association (APPA), and GridWise. GridWise is an association of Utilities, industry research organizations (e.g., EPRI and Pacific Northwest National Laboratory), and Utility vendors, working in cooperation with the DOE to promote Smart Grid policy, regulatory issues, and technologies (see www.gridwise.org for more information). Alcatel-Lucent is also represented on the Board of Directors of UTC's Smart Network Council, which was established in 2008 to promote and develop Smart Grid policies, guidelines, and recommended technologies and strategies for Smart Grid solution implementation.

Alcatel-Lucent IP MPLS Solution for the Next Generation Utility Network

Utility companies are experienced at building and operating reliable and effective networks to ensure the delivery of essential information and maintain flawless service delivery. The Alcatel-Lucent IP/MPLS solution can enable the utility operator to extend and enhance its network with new technologies like IP, Ethernet and MPLS. These new technologies will enable the utility to optimize its network to reduce both CAPEX and OPEX without jeopardizing reliability. Advanced technologies also allow the introduction of new Smart Grid applications that can improve operational and workflow efficiency within the utility. Alcatel-Lucent leverages cutting edge technologies along with the company’s broad and deep experience in the utility industry to help utility operators build better, next-generation networks with IP/MPLS.

Alcatel-Lucent has years of experience in the development of IP, MPLS and Ethernet technologies. The Alcatel-Lucent IP/MPLS solution offers utility operators the flexibility, scale and feature sets required for mission-critical operation. With the broadest portfolio of products and services in the telecommunications industry, Alcatel-Lucent has the unparalleled ability to design and deliver end-to-end solutions that drive next-generation utility networks.

About Alcatel-Lucent

Alcatel-Lucent’s vision is to enrich people’s lives by transforming the way the world communicates. As a leader in utility, enterprise and carrier IP technologies, fixed, mobile and converged broadband access, applications, and services, Alcatel-Lucent offers the end-to-end solutions that enable compelling communications services for people at work, at home and on the move.

With 77,000 employees and operations in more than 130 countries, Alcatel-Lucent is a local partner with global reach. The company has the most experienced global services team in the industry, and Bell Labs, one of the largest research, technology and innovation organizations focused on communications. Alcatel-Lucent achieved adjusted revenues of €17.8 billion in 2007, and is incorporated in France, with executive offices located in Paris.

Successful Smart Grid Architecture

The smart grid is progressing well on several fronts. Groups such as the GridWise Alliance, events such as GridWeek, and national policy citations such as the American Recovery and Reinvestment Act in the U.S., for example, have all brought more positive attention to this opportunity. The boom in distributed renewable energy and its demands for a bidirectional grid are driving the need forward, as are sentiments for improving consumer control and awareness, giving customers the ability to engage in real-time energy conservation.

On the technology front, advances in wireless and other data communications make wide-area sensor networks more feasible. Distributed computation is certainly more powerful – just consider your iPod! Even architectural issues such as interoperability are now being addressed in their own forums, such as Grid-Interop. It seems the recipe for a smart grid is coming together in a way that would make many of those who envisioned it proud. But to avoid making a gooey mess in the oven, an overall architecture that carefully considers seven key ingredients for success must first exist.

Sources of Data

Utilities have eons of operational data: both real-time and archival, both static (such as nodal diagrams within distribution management systems) and dynamic (such as switching orders). There is a wealth of information generated by field crews, and from root-cause analyses of past system failures. Advanced metering infrastructure (AMI) implementations become a fine-grained distribution sensor network feeding communication aggregation systems such as Silver Spring Networks’ UtilityIQ or Trilliant’s Secure Mesh Network.

These data sources need to be architected so they remain available to enhance, support and provide context for real-time data coming in from new intelligent electronic devices (IEDs) and other smart grid devices. In an era of renewable energy sources, grid connection controllers become yet another data source. With renewables, micro-scale weather forecasting such as IBM Research’s Deep Thunder can provide valuable context for grid operation.

Data Models

Once data is obtained, in order to preserve its value in a standard format, one can think in terms of an extensible markup language (XML)-oriented database. Modern implementations of these databases have improved performance characteristics, and the International Electrotechnical Commission (IEC) common information model/generic interface definition (CIM/GID), though oriented more to assets than operations, is a front-running candidate for consideration.

Newer entries, such as the device language message specification/companion specification for energy metering (DLMS/COSEM) for AMI, are also coming into practice. Sometimes more important than the technical implementation of the data, however, is the model that is employed. A well-designed data model not only makes exchange of data and legacy program adjustments easier, but it can also ease the application of security and performance requirements. The existence of data models is often a good indicator of an intact governance process, for it facilitates use of the data by multiple applications.
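As a deliberately simplified illustration of the value of a standard, model-driven format, the sketch below round-trips a small asset record through an XML representation so that multiple applications can share one model. The element names and the sample record are hypothetical; a production system would follow the actual IEC CIM schema (IEC 61968/61970).

```python
import xml.etree.ElementTree as ET

def asset_to_xml(asset: dict) -> str:
    """Serialize a simple asset record to an XML fragment.

    Element names here are illustrative only; a real deployment
    would conform to the IEC CIM schema."""
    root = ET.Element("Asset", attrib={"mRID": asset["mrid"]})
    for field in ("name", "type", "location"):
        child = ET.SubElement(root, field)
        child.text = str(asset[field])
    return ET.tostring(root, encoding="unicode")

def xml_to_asset(xml_text: str) -> dict:
    """Parse the fragment back into a record, recovering the same model."""
    root = ET.fromstring(xml_text)
    asset = {"mrid": root.get("mRID")}
    for child in root:
        asset[child.tag] = child.text
    return asset
```

Because both directions work against one agreed model, a legacy billing system and a new analytics application can consume the same record without bilateral format negotiations.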

Communications

Customer workshops and blueprinting sessions have shown that one of the most common issues needing to be addressed is the design of the wide-area communication system. Data communications architecture affects data rate performance, the cost of distributed intelligence and the identification of security susceptibilities.

There is no single communications technology that is suitable for all utilities, or even for all operational areas across any individual utility. Rural areas may be served by broadband over powerline (BPL), while urban areas benefit from multi-protocol label switching (MPLS) and purpose-designed mesh networks, enhanced by their proximity to fiber.

In the future, there could be entirely new choices in communications. So, the smart grid architect needs to focus on security, standardized interfaces to accept new technology, enablement of remote configuration of devices to minimize any touching of smart grid devices once installed, and future-proofing the protocols.

The architecture should also be traceable to the business case. This needs to include probable use cases that may not be in the PUC filing, such as AMI now, but smart grid later. Few utilities will be pleased with the idea of a communication network rebuild within five years of deploying an AMI-only network.

Communications architecture must also consider power outages, so battery backup, solar recharging, or other equipment may be required. Even arcane details such as “Will the antenna on a wireless device be the first thing to blow off in a hurricane?” need to be considered.
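The interface-first discipline described above can be sketched in a few lines. The class and method names below are hypothetical, not drawn from any product: the point is that back-office applications code against a standardized transport abstraction, so BPL, MPLS or mesh links – and whatever technology comes next – can be swapped in without touching smart grid devices once installed.

```python
from abc import ABC, abstractmethod

class GridTransport(ABC):
    """Illustrative transport abstraction for wide-area grid communications."""

    @abstractmethod
    def send(self, device_id: str, payload: bytes) -> bool:
        """Deliver a payload to a field device."""

    @abstractmethod
    def configure_remotely(self, device_id: str, settings: dict) -> bool:
        """Push configuration without a field visit (no truck rolls)."""

class MeshTransport(GridTransport):
    """Toy in-memory stand-in for a purpose-designed mesh network."""

    def __init__(self):
        self.outbox = []   # messages handed to the (simulated) network
        self.configs = {}  # per-device settings applied remotely

    def send(self, device_id, payload):
        self.outbox.append((device_id, payload))
        return True

    def configure_remotely(self, device_id, settings):
        self.configs.setdefault(device_id, {}).update(settings)
        return True
```

A future BPL or cellular implementation would subclass the same interface, which is what keeps the architecture future-proof at the protocol layer.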

Security

Certainly, the smart grid’s purpose is to enhance network reliability, not lower its security. But with the advent of the North American Electric Reliability Corp. Critical Infrastructure Protection (NERC-CIP) standards, security has risen to become a prime consideration, usually addressed in phase one of the smart grid architecture.

Unlike the data center, field-deployed security has many new situations and challenges. There is security at the substation – for example, who can access what networks, and when, within the control center. At the other end, security of the meter data in a proprietary AMI system needs to be addressed so that only authorized applications and personnel can access the data.

Service oriented architecture (SOA) appliances are network devices to enable integration and help provide security at the Web services message level. These typically include an integration device, which streamlines SOA infrastructures; an XML accelerator, which offloads XML processing; and an XML security gateway, which helps provide message-level, Web-services security. A security gateway helps to ensure that only authorized applications are allowed to access the data, whether an IP meter or an IED. SOA appliance security features complement the SOA security management capabilities of software.
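The message-level gatekeeping a security gateway performs can be illustrated with a minimal sketch using keyed hashing. The shared key and message format below are hypothetical stand-ins; real gateways rely on managed credentials, certificates and full Web-services security processing rather than a hard-coded key.

```python
import hashlib
import hmac

# Illustrative only: a real gateway would pull keys from a credential store.
SHARED_KEY = b"demo-key"

def sign_reading(payload: bytes, key: bytes = SHARED_KEY) -> str:
    """Producer side: attach a keyed digest to a meter message."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def gateway_admit(payload: bytes, signature: str, key: bytes = SHARED_KEY) -> bool:
    """Gateway side: admit the message only if the digest verifies,
    so only authorized applications can feed or read the data."""
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(expected, signature)
```

A tampered payload fails verification and is dropped before it ever reaches billing or operational applications.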

Proper architectures could address dynamic, trusted virtual security domains, and be combined not only with intrusion protection systems, but also with anomaly detection systems. If hackers can introduce viruses in data (such as malformed video images that leverage faults in media players), then similar concerns should be under discussion with smart grid data. Is messing with 300 megawatts (MW) of demand response much different from cyber attacking a 300 MW generator?

Analytics

A smart grid cynic might say, “Who is going to look at all of this new data?” That is where analytics supports the processing, interpretation and correlation of the flood of new grid observations. One part of the analytics would be performed by existing applications. This is where data models and integration play a key role. Another part of the analytics dimension is with new applications and the ability of engineers to use a workbench to create their customized analytics dashboard in a self-service model.

Many utilities have power system engineers in a back office using spreadsheets; part of the smart grid concept is that all data is available to the community to use modern tools to analyze and predict grid operation. Analytics may need a dedicated data bus, separate from an enterprise service bus (ESB) or enterprise SOA bus, to meet the timeliness and quality of service to support operational analytics.

A two-tier or three-tier (if one considers the substations) bus is an architectural approach to segregate data by speed and still maintain interconnections that support a holistic view of the operation. Connections to standard industry tools such as ABB’s NEPLAN® or Siemens Power Technologies International’s PSS®E, or general tools such as MATLAB, should be considered at design time, rather than as an additional expense commitment after smart grid commissioning.
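The tiered-bus idea can be sketched as two queues with different service expectations. This is a toy illustration under stated assumptions, not a product design: the names are invented, and a real deployment would use messaging middleware with enforced quality of service rather than in-memory queues.

```python
from collections import deque

class TwoTierBus:
    """Illustrative segregation of traffic by timeliness: real-time
    operational messages ride a dedicated bus, while general enterprise
    traffic rides the ESB tier."""

    def __init__(self):
        self.operational = deque()  # low-latency operational tier
        self.enterprise = deque()   # general enterprise SOA/ESB tier

    def publish(self, topic: str, message: dict, realtime: bool = False):
        target = self.operational if realtime else self.enterprise
        target.append((topic, message))

    def drain_operational(self):
        """Operational consumers are served first to honor QoS targets."""
        items = list(self.operational)
        self.operational.clear()
        return items
```

The design choice being illustrated is simply that operational analytics never queue behind bulk enterprise traffic.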

Integration

Once data is sensed, securely communicated, modeled and analyzed, the results need to be applied for business optimization. This means new smart grid data gets integrated with existing applications, and metadata locked in legacy systems is made available to provide meaningful context.

This is typically accomplished by enabling systems as services per the classic SOA model. However, issues of common data formats, data integrity and name services must be considered. Data integrity includes verification and cross-correlation of information for validity, and designation of authoritative sources and specific personnel who own the data.

Name services addresses the common issue of an asset – whether transformer or truck – having multiple names in multiple systems. An example might be a substation that has a location name, such as Walden; a geographic information system (GIS) identifier such as latitude and longitude; a map name such as nearest cross streets; a capital asset number in the financial system; a logical name in the distribution system topology; an abbreviated logical name to fit in the distribution management system graphical user interface (DMS GUI); and an IP address for the main network router in the substation.

Different applications may know new data by association with one of those names, and that name may need translation to be used in a query with another application. While rewriting the applications to a common model may seem appealing, it may very well send a CIO into shock. While the smart grid should help propagate intelligence throughout the utility, this doesn’t necessarily mean to replace everything, but it should “information-enable” everything.
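A minimal name-service sketch makes the translation problem concrete. All identifiers below are hypothetical, echoing the Walden substation example: one canonical asset carries a different local name in each system, and queries translate between them.

```python
class NameService:
    """Illustrative name-translation registry: one asset, many
    system-specific names (GIS, financial, DMS GUI, and so on)."""

    def __init__(self):
        self._by_system = {}  # (system, local_name) -> canonical id
        self._aliases = {}    # canonical id -> {system: local_name}

    def register(self, canonical: str, system: str, local_name: str):
        self._by_system[(system, local_name)] = canonical
        self._aliases.setdefault(canonical, {})[system] = local_name

    def translate(self, from_system: str, local_name: str, to_system: str) -> str:
        """Resolve a local name to the canonical id, then to the target system."""
        canonical = self._by_system[(from_system, local_name)]
        return self._aliases[canonical][to_system]
```

This is the "information-enable everything" approach: applications keep their own names, and the registry brokers between them rather than forcing a rewrite to a common model.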

Interoperability is essential at both the service level and the application level. Some vendors focus more on the service level, but consider, for example, making a cell phone call from the U.S. to France: your voice data may well be code division multiple access (CDMA) in the U.S., travel by microwave and fiber along its path, and emerge in France in a global system for mobile communications (GSM) environment, yet your speech – the “application-level data” – is retained transparently (though technology does not yet address accents!).

Hardware

The world of computerized solutions does not speak to software alone. For instance, AMI storage consolidation addresses the concern that the volume of data coming into the utility will be increasing exponentially. As more meter data can be read in an on-demand fashion, data analytics will be employed to properly understand it all, requiring a sound hardware architecture to manage, back-up and feed the data into the analytics engines. In particular, storage is needed in the head-end systems and the meter-data management systems (MDMS).

Head-end systems pull data from the meters to provide management functionality while the MDMS collects data from head-end systems and validates it. Then the data can be used by billing and other business applications. Data in both the head-end systems and the master copy of the MDMS is replicated into multiple copies for full back up and disaster recovery. For MDMS, the master database that stores all the aggregated data is replicated for other business applications, such as customer portal or data analytics, so that the master copy of the data is not tampered with.
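The validation step the MDMS performs can be sketched as a simple filter over interval reads pulled from a head-end system. The record layout and rules below (non-negative consumption, monotonically increasing timestamps) are illustrative assumptions; production systems apply full validation, estimation and editing (VEE) rule sets before billing applications see the data.

```python
def validate_readings(readings):
    """Split head-end interval reads into valid and rejected lists.

    Each reading is a dict: {"meter": str, "ts": int, "kwh": float}.
    Thresholds and rules here are illustrative only."""
    valid, rejected = [], []
    previous = None
    for r in readings:
        ok = r["kwh"] >= 0 and (previous is None or r["ts"] > previous["ts"])
        if ok:
            valid.append(r)
            previous = r  # track the last accepted read for ordering checks
        else:
            rejected.append(r)
    return valid, rejected
```

Only the validated list would flow on to the master MDMS store; rejects would be queued for estimation or manual review.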

Since the smart grid essentially performs in real time, and the electricity business is non-stop, one must think of hardware and software solutions as needing to be fail-safe with automated redundancy. The AMI data especially needs to be reliable. The key factors then become: operating system stability; hardware true memory access speed and range; server and power supply reliability; file system redundancy, such as a journaled file system (JFS); and techniques such as FlashCopy to provide a point-in-time copy of a logical drive.

FlashCopy can be useful in speeding up database hot backups and restores. VolumeCopy can extend the replication functionality by providing the ability to copy the contents of one volume to another. Enhanced remote mirroring (Global Mirror, Global Copy and Metro Mirror) can provide the ability to mirror data from one storage system to another over extended distances.

Conclusion

Those are seven key ingredients for designing or evaluating a recipe for success with regard to implementing the smart grid at your utility. Addressing these dimensions will help achieve a solid foundation for a comprehensive smart grid computing system architecture.

Empowering the Smart Grid

Trilliant is the leader in delivering intelligent networks that power the smart grid. Trilliant provides hardware, software and service solutions that deliver on the promise of Advanced Metering and Smart Grid to utilities and their customers, including improved energy efficiency, grid reliability, lower operating cost, and integration of renewable energy resources.

Since its founding in 1985, the company has been a leading innovator in the delivery and implementation of advanced metering infrastructure (AMI), demand response and grid management solutions, in addition to installation, program management and meter revenue cycle services. Trilliant is focused on enabling choice for utility companies, ranging from meter, network and IT infrastructures to full or hybrid outsource models.

Solutions

Trilliant provides fully automated, two-way wireless network solutions and software for smart grid applications. The company’s smart grid communications solutions enable utilities to create a more efficient and robust operational infrastructure to:

  • Read meters on demand at intervals of five minutes or less;
  • Improve cash flow;
  • Improve customer service;
  • Decrease issue resolution time;
  • Verify outages and restoration in real time;
  • Monitor substation equipment;
  • Perform on/off cycle reads;
  • Conduct remote connect/disconnect;
  • Significantly reduce/eliminate energy theft through tamper detection; and
  • Realize accounting/billing improvements.

Trilliant solutions also enable the introduction of services and programs such as:

  • Dynamic demand response; and
  • Time-of-use (TOU), critical peak pricing (CPP) and other special tariffs and related metering.

Solid Customer Base

Trilliant has secured contracts for more than three million meters to be supported by its network solutions and services, encompassing both C&I and residential applications. The company has delivered products and services to more than 200 utility customers, including Duke Energy, E.ON US (Louisville Gas & Electric), Hydro One, Hydro Quebec, Jamaica Public Service Company Ltd., Milton Hydro, Northeast Utilities, PowerStream, Public Service Electric & Gas, San Diego Gas & Electric, Toronto Hydro Electric System Ltd., and Union Gas, among others.

The Distributed Utility of the (Near) Future

The next 10 to 15 years will see major changes – what future historians might even call upheavals – in the way electricity is distributed to businesses and households throughout the United States. The exact nature of these changes and their long-term effect on the security and economic well-being of this country are difficult to predict. However, a consensus already exists among those working within the industry – as well as with politicians and regulators, economists, environmentalists and (increasingly) the general public – that these fundamental changes are inevitable.

This need for change is in evidence everywhere across the country. The February 26, 2008, temporary blackout in Florida served as just another warning that the existing paradigm is failing. Although the exact cause of that blackout had not yet been identified at the time of this writing, the incident serves as a reminder that the nationwide interconnected transmission and distribution grid is no longer stable. To wit: disturbances in Florida on that Tuesday were noted and measured as far away as New York.

A FAILING MODEL

The existing paradigm of nationwide grid interconnection brought about primarily by the deregulation movement of the late 1990s emphasizes that electricity be generated at large plants in various parts of the country and then distributed nationwide. There are two reasons this paradigm is failing. First, the transmission and distribution system wasn’t designed to serve as a nationwide grid; it is aged and only marginally stable. Second, political, regulatory and social forces are making the construction of large generating plants increasingly difficult, expensive and eventually unfeasible.

The previous historic paradigm made each utility primarily responsible for generation, transmission and distribution in its own service territory; this had the benefit of localizing disturbances and fragmenting responsibility and expense. With loose interconnections to other states and regions, a disturbance in one area or a lack of resources in a different one had considerably less effect on other parts of the country, or even other parts of service territories.

For better or worse, we now have a nationwide interconnected grid – albeit one that was neither designed for the purpose nor serves it adequately. Although the existing grid can be improved, the expense would be massive, and probably cost prohibitive. Knowledgeable industry insiders, in fact, calculate that it would cost more than the current market value of all U.S. utilities combined to modernize the nationwide grid and replace its large generating facilities over the next 30 years. Obviously, the paradigm is going to have to change.

While the need for dramatic change is clear, though, what’s less clear is the direction that change should take. And time is running short: North American Electric Reliability Corp. (NERC) projects serious shortages in the nation’s electric supply by 2016. Utilities recognize the need; they just aren’t sure which way to jump first.

With a number of tipping points already reached (and the changes they describe continuing to accelerate), it’s easy to envision the scenario that’s about to unfold. Consider the following:

  • The United States stands to face a serious supply/demand disconnect within 10 years. Unless something dramatic happens, there simply won’t be nearly enough electricity to go around. Already, some parts of the country are feeling the pinch. And regulatory and legislative uncertainty (especially around global warming and environmental issues) makes it difficult for utilities to know what to do. Building new generation of any type other than “green energy” is extremely difficult, and green energy – which currently meets less than 3 percent of U.S. supply needs – cannot close the growing gap between supply and demand being projected by NERC. Specifically, green energy will not be able to replace the 50 percent of U.S. electricity currently supplied by coal within that 10-year time frame.
  • Fuel prices continue to escalate, and the reliability of the fuel supply continues to decline. In addition, increasing restrictions are being placed on fuel selection, especially coal.
  • A generation of utility workers is nearing retirement, and finding adequate replacements among the younger generation is proving increasingly difficult.
  • It’s extremely difficult to site new transmission – needed to deal with supply-and-demand issues. Even new Federal Energy Regulatory Commission (FERC) authority to authorize corridors is being met with virulent opposition.

SMART GRID NO SILVER BULLET

Distributed generation – including many smaller supply sources to replace fewer large ones – and “smart grids” (designed to enhance delivery efficiency and effectiveness) have been posited as solutions. However, although such solutions offer potential, they’re far from being in place today. At best, smart grids and smarter consumers are only part of the answer. They will help reduce demand (though probably not enough to make up the generation shortfall), and they’re both still evolving as concepts. While most utility executives recognize the problems, they continue to be uncertain about the solutions and have a considerable distance to go before implementing any of them, according to recent Sierra Energy Group surveys.

According to these surveys, more than 90 percent of utility executives now feel that the intelligent utility enterprise and smart grid (IUE/SG) – that is, the distributed utility – represents an inevitable part of their future (Figure 1). This finding was true of all utility types supplying electricity.

Although utility executives understand the problem and the IUE/SG approach to solving part of it, they’re behind in planning on exactly how to implement the various pieces. That “planning lag” for the vision can be seen in Figure 2.

At least some fault for the planning lag can be attributed to forces outside the utilities. While politicians and regulators have been emphasizing conservation and demand response, they’ve failed to produce guidelines for how this will work. And although a number of states have established mandatory green power percentages, Congress failed to do the same in the energy legislation adopted in December 2007 (the Energy Independence and Security Act). While the EPACT of 2005 “urged” regulators to “urge” utilities to install smart meters, it didn’t make their installation a requirement, and thus regulators have moved at different speeds in different parts of the country on this urging.

Although we’ve entered a new era, utilities remain burdened with the internal problems caused by the “silo mentality” left over from generations of tight regulatory control. Today, real-time data is often still jealously guarded in engineering and operations silos. However, a key component in the development of intelligent utilities will be pushing both real-time and back-office data onto dashboards so that executives can make real-time decisions.

Getting from where utilities were (and in many respects still are) in the last century to where they need to be by 2018 isn’t a problem that can be solved overnight. And, in fact, utilities have historically evolved slowly. Today’s executives know that technological evolution in the utility industry needs to accelerate rapidly, but they’re uncertain where to start. For example, should you install advanced metering infrastructure (AMI) as rapidly as possible? Do you emphasize automating the grid and adding artificial intelligence? Do you continue to build out mobile systems to push data (and more detailed, simpler instructions) to field crews who soon will be much younger and less experienced? Do you rush into home automation? Do you build windmills and solar farms? Utilities have neither the financial nor human resources to do everything at once.

THE DEMAND FOR AMI

Its name implies that a smart grid will become increasingly self-operating and self-healing – and indeed much of the technology for this type of intelligent network grid has been developed. It has not, however, been widely deployed. Utilities, in fact, have been working on basic distribution automation (DA) – the capability to operate the grid remotely – for a number of years.

As mentioned earlier, most theorists – not to mention politicians and regulators – feel that utilities will have to enable AMI and demand response/home automation if they’re to encourage energy conservation in an impending era of short supplies. While automated meter reading (AMR) has been around for a long time, its penetration remains relatively small in the utilities industry – especially in the case of advanced AMI meters for enabling demand response: according to figures released by Sierra Energy Group and Newton-Evans Research Co., only 8 to 10 percent of this country’s utilities were using AMI meters by 2008.

That said, the push for AMI on the part of both EPACT 2005 and regulators is having an obvious effect. Numerous utilities (including companies like Entergy and Southern Co.) that previously refused to consider AMR now have AMI projects in progress. However, even though an anticipated building boom in AMI is finally underway, there’s still much to be done to enable the demand response that will be desperately needed by 2016.

THE AUTOMATED HOME

The final area we can expect the IUE/SG concept to envelop comes at the residential level. With residential home automation in place, utilities will be able to control usage directly – by adjusting thermostats or compressor cycling, or via other techniques. Again, the technology for this has existed for some time; however, there are very few installations nationwide. A number of experiments were conducted with home automation in the early- to mid-1990s, with some subdivisions even being built under the mantra of “demand-side management.”

Demand response – the term currently in vogue with politicians – may be considered more politically correct, but the net result is the same. Home automation will enable regulators, through utilities, to ration usage. Although politicians avoid using the word rationing, if global warming concerns continue to seriously impact utilities’ ability to access adequate generation, rationing will be the result – making direct load control at the residential level one of the most problematic issues in the distributed utility paradigm of the future. Are large numbers of Americans going to acquiesce calmly to their electrical supply being rationed? No one knows, but there seem to be few options.

GREEN PRESSURE AND THE TIPPING POINT

While much legitimate scientific debate remains about whether global warming is real and, if so, whether it’s a naturally occurring or man-made phenomenon (arising primarily from carbon dioxide emissions), that debate is diminishing among politicians at every level. The majority of politicians, in fact, have bought into the notion that carbon emissions from many sources – primarily the generation of electricity by burning coal – are the culprit.

Thus, despite continued scientific debate, the political tipping point has been reached, and U.S. politicians are making moves to force this country’s utility industry to adapt to a situation that may or may not be real. Whether or not it makes logical or economic sense, utilities are under increasing pressure to adopt the Intelligent Utility/Smart Grid/Home Automation/Demand Response model – a model that includes many small generation points to make up for fewer large plants. This political tipping point is also shutting down more proposed generation projects each month, adding to the likely shortage. Since 2000, approximately 50 percent of all proposed new coal-fired generation plants have been canceled, according to energy-industry adviser Wood Mackenzie (Gas and Power Service Insight, February 2008).

In the distant future, as technology continues to advance, electric generation in the United States will likely include a mix of energy sources, many of them distributed and green. However, there’s no way that in the next 10 years – the window of greatest concern in the NERC projections on the generation and reliability side – green energy will be ready and available in sufficient quantities to forestall a significant electricity shortfall. Nuclear energy represents the only truly viable solution; however, ongoing opposition to this form of power generation makes it unlikely that sufficient nuclear energy will be available within this period. The already-lengthy licensing process (though streamlined somewhat of late by the Nuclear Regulatory Commission) is exacerbated by lawsuits and opposition every step of the way. In addition, most of the necessary engineering and manufacturing processes have been lost in the United States over the last 30 years – the time elapsed since the last U.S. nuclear plant was built – making it necessary to reacquire that knowledge from abroad.

The NERC Reliability Report of Oct. 15, 2007, points strongly toward a significant shortfall of electricity within approximately 10 years – a situation that could lead to rolling blackouts and brownouts in parts of the country that have never experienced them before. It could also lead to mandatory “demand response” – in other words, rationing – at the residential level. This situation, however, is not inevitable: technology exists to prevent it (including nuclear and cleaner coal now as well as a gradual development of solar, biomass, sequestration and so on over time, with wind for peaking). But thanks to concern over global warming and other issues raised by the environmental community, many politicians and regulators have become convinced otherwise. And thus, they won’t consider a different tack to solving the problem until there’s a public outcry – and that’s not likely to occur for another 10 years, at which point the national economy and utilities may already have suffered tremendous (possibly irreparable) harm.

WHAT CAN BE DONE?

The problem the utilities industry faces today is neither economic nor technological – it’s ideological. The global warming alarmists are shutting down coal before sufficient economically viable replacements (with the possible exception of nuclear) are in place. And the rest of the options are tied up in court. (For example, the United States needs some 45 liquefied natural gas terminals to support a shift to gas – a costly fuel with iffy reliability – but only five have been built; the rest are tied up in court.) As long as it’s possible to tie up nuclear applications for five to 10 years and shut down “clean coal” plants through the political process, the U.S. utility industry is left with few options.

So what are utilities to do? They must get much smarter (IUE/SG), and they must prepare for rationing (AMI/demand response). As seen in SEG studies, utilities still have a ways to go in these areas, but at least this is a strategy that can (for the most part) be put in place within 10 to 15 years. The technology for IUE/SG already exists; it’s relatively inexpensive (compared with large-scale green energy development and nuclear plant construction); and utilities can employ it with relatively little regulatory oversight. In fact, regulators are actually encouraging it.

For these reasons, IUE/SG represents a major bridge to a more stable future. Even if today’s apocalyptic scenarios fail to develop – that is, global warming is debunked, or new generation sources develop much more rapidly than expected – intelligent utilities with smart grids will remain a good idea. The paradigm is shifting as we watch – but will that shift be completed in time to prevent major economic and social dislocation? Fasten your seatbelts: the next 10 to 15 years should be very interesting!

Intelligent Communications Platform Provides Foundation for Clean Technology Solutions to Smart Grid

Since the wake-up call of the 2003 blackout in the northeastern United States and Canada, there’s been a steady push to improve the North American power grid. Legislation in both the United States and Canada has encouraged investments in technologies intended to make the grid intelligent and to solve critical energy issues. The Energy Policy Act (EPAct) of 2005 mandated that each state evaluate the business case for advanced metering infrastructure (AMI). In Ontario, the Energy Conservation Responsibility Act of 2006 mandated deployment of smart meters to all consumers by 2010. And the recent U.S. Energy Independence and Security Act of 2007 expands support from the U.S. government for investments in smart grid technologies while further emphasizing the need for the power industry to play a leadership role in addressing carbon dioxide emissions affecting climate change.

Recent state-level legislation and consumer sentiment suggest an increasing appetite for investments in distributed clean-technology energy solutions. Distributed generation technologies such as solar, wind and bio-diesel are becoming more readily available and have the potential to significantly improve grid operations and reliability.

THE NEXT STEP

Although the full vision for the smart grid is still somewhat undefined, most agree that an intelligent communications platform is a necessary foundation for developing and realizing this vision. Of the 10 elements that define the smart grid in the Energy Independence and Security Act of 2007, more than half directly relate to or depend on advanced communications capabilities.

A core business driver for intelligent communications is full deployment of smart metering, also referred to as advanced metering infrastructure. AMI involves automated measurement of time-of-use energy consumption – at either hourly or 15-minute intervals – and provides for new time-of-use rates that encourage consumers to use energy during off-peak hours, when generation costs are low, rather than during peak periods, when generation costs are high and the grid is under stress. With time-of-use rates, consumers may continue to use power during peak periods but will pay a higher price to do so. AMI may also include remote service switch functionality that can reduce costs associated with site visits otherwise required to manage move-outs/move-ins or to support prepayment programs.
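The time-of-use billing logic described above can be sketched in a few lines. This is a minimal illustration, not any utility's actual tariff: the rates, the weekday 7 a.m.-7 p.m. peak window, and the function names are all assumptions for the example.

```python
from datetime import datetime

# Hypothetical time-of-use rates ($/kWh); actual tariffs vary by utility.
RATES = {"peak": 0.18, "off_peak": 0.07}

def period(ts: datetime) -> str:
    """Classify an hourly interval: weekdays 7 a.m.-7 p.m. as peak (illustrative only)."""
    if ts.weekday() < 5 and 7 <= ts.hour < 19:
        return "peak"
    return "off_peak"

def tou_bill(reads):
    """reads: iterable of (timestamp, kWh) hourly interval reads."""
    totals = {"peak": 0.0, "off_peak": 0.0}
    for ts, kwh in reads:
        totals[period(ts)] += kwh
    return {p: round(kwh * RATES[p], 2) for p, kwh in totals.items()}

reads = [
    (datetime(2008, 6, 2, 14), 2.0),   # Monday 2 p.m. -> peak
    (datetime(2008, 6, 2, 23), 2.0),   # Monday 11 p.m. -> off-peak
]
print(tou_bill(reads))  # {'peak': 0.36, 'off_peak': 0.14}
```

The same 2 kWh of consumption costs more than twice as much on-peak, which is exactly the price signal AMI is meant to deliver.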

Other smart grid capabilities that may be easily realized through the deployment of intelligent communications and AMI include improved outage detection and restoration monitoring, revenue assurance and virtual metering of distribution assets.

CRITICAL ATTRIBUTES OF AMI SOLUTIONS

Modern communications network solutions leverage standards-based technology such as IEEE 802.15.4 to provide robust two-way wireless mesh network communications to intelligent devices. The intelligent communications platform should provide for remote firmware upgrades to connected intelligent devices and be capable of leveraging Internet protocol-based communications across multiple wide-area network (WAN) options (Figure 1).

Critical for maximizing the value of a communications infrastructure investment is support for broad interoperability and interconnectivity. Interoperability for AMI applications means supporting a range of options for metering devices. A communications platform system should be meter manufacturer-independent, empowering choice for utilities. This provides for current and future competitiveness for the meter itself, which is one of the more expensive elements of the smart metering solution.

Interconnectivity refers to a communications platform's ability to support a broad range of functions, spanning both end-point devices and head-end systems. To support demand-side management and energy-efficiency initiatives, an intelligent communications platform should support programmable communicating thermostats (PCTs), in-home displays (IHDs) and load control switches.

The system may also support standards-based home-area networks (HANs) such as ZigBee and Z-Wave (from Zensys). Ultimately an intelligent communications platform should support a model whereby third-party manufacturers can develop solutions that operate on the network, providing competitive options for utilities.

For enterprise system interconnectivity, an AMI demand-side management or other smart grid head-end application should be developed using service-oriented architecture (SOA) principles and Web technologies. These applications should also support modern Web services-based solutions, providing published simple object access protocol (SOAP)-based APIs. This approach provides for easier integration with existing enterprise systems and simplifies the process of adding functionality (either through enhancements provided by the vendor or add-ons delivered by third parties or developed by the utility).
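A SOAP-based integration of the kind described above ultimately reduces to exchanging well-formed XML envelopes. The sketch below builds a request envelope with Python's standard library; the service namespace and the GetMeterReads operation are hypothetical examples for illustration, not a published AMI vendor API.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace and operation name, for illustration only.
SVC_NS = "http://example.com/ami/meterdata"

def build_request(meter_id: str, date: str) -> bytes:
    """Assemble a minimal SOAP 1.1 request envelope for a meter-read query."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetMeterReads")
    ET.SubElement(op, f"{{{SVC_NS}}}MeterId").text = meter_id
    ET.SubElement(op, f"{{{SVC_NS}}}ReadDate").text = date
    return ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

print(build_request("M-001234", "2008-06-02").decode())
```

Because the payload is plain XML over HTTP, any enterprise system that can parse XML can consume the published API, which is what makes the SOA approach attractive for integrating CIS, outage management and third-party add-ons.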

Finally, the value of an intelligent communications platform deployment is driven by the ability of other enterprise applications and processes to utilize the vast amount of new data received through the AMI, demand-side management and smart grid applications. Core areas of extended value include integration with customer information systems and call center processes, and integration with outage management and work management systems. In addition, the intelligent communications platform makes utilities much better able to market new offerings to targeted customers based on their energy consumption profiles while also empowering consumers with new tools and access to information. The result: greater control over energy consumption costs and improved satisfaction.

INTEGRATION OF DISTRIBUTED GENERATION RESOURCES

Deployment and integration of distributed generation, including renewable resources, is an important supply-side element of the smart grid vision. This may include the installation of arrays of solar photovoltaic panels on home and office roofs, solar carports, small wind turbines (3-5 kVA), small biogas turbines and fuel cells.

By integrating these resources into a common communications platform, utilities have the opportunity to develop solutions that achieve much greater results than the sum of independent systems. For example, an intelligent plug-in hybrid electric vehicle (PHEV) connected to a smart solar carport may choose when to purchase power for charging or even sell power back to the grid in a vehicle-to-grid (V2G) model, based on dynamic price signals received through the communications platform. By maintaining intelligence at the edge of the grid, consumers and distributed resource owners can be empowered to manage resources to their own benefit and to the benefit of the grid as a whole.

SUMMARY

Now is the time to embark on realizing the smart grid vision. Global warming and system reliability issues are driving a sense of urgency. An intelligent communications platform provides a foundation capable of supporting multiple devices in multiple environments – commercial, industrial and residential – working seamlessly together in a single unified network.

All of the technical assets of a smart grid can be managed holistically rather than as isolated or poorly connected parts. The power of a network grows geometrically according to the amount of resources and assets actively connected to it. This is the future of the smart grid, and it’s available today.
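The "grows geometrically" claim above follows from simple combinatorics: n connected devices can form n(n-1)/2 distinct links, so potential network value scales roughly with the square of connected assets. This is a Metcalfe's-law-style heuristic, not a hard rule, and the function name here is just for illustration.

```python
# Pairwise-connection count behind the network-value claim:
# n devices can form n*(n-1)/2 distinct links.
def pairwise_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(n, pairwise_links(n))
# 10 -> 45, 100 -> 4950, 1000 -> 499500
```

Multiplying connected assets by 10 multiplies potential interconnections by roughly 100, which is why a unified network outperforms isolated point solutions.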

Smart Meters on a Roll in Canada

Electricity supply challenges in Ontario, Canada, have led the provincial government there to take aggressive action on both the supply and demand sides to meet customer electricity needs. Between now and 2025, it’s estimated that Ontario must build an almost entirely new electricity system – including replacing approximately 80 percent of current generating facilities (as they’re retired over time) and expanding the system to meet future growth. However, just as building new supply is vital, so too is conservation. That’s why Ontario’s provincial government is introducing new tools like smart meters to encourage electricity consumers to think more about how and when they use electricity. By implementing a smart metering infrastructure by 2010, the province hopes to provide a foundation for achieving a more than five percent reduction in provincial demand through load shifting, energy savings and price awareness.

Hydro One owns and operates one of the 10 largest transmission and distribution systems in North America, serving a geographic area of about 640,000 square kilometers. As the leading electricity transmitter and distributor in Ontario, the company supports the province’s goal of creating a conservation culture in Ontario and having a smart meter in every Ontario home and small business. The company’s allocation of the province’s target was 240,000 smart meters by 2007 and the full 1.3 million by 2010.

The task for Hydro One and other local distribution companies (LDCs) in the province is to meet the government time line while at the same time building an enabling solution that provides the most upside for operations, demand management and customer satisfaction. Working with the industry regulator and the LDCs, phased goals were established and allocated among the major utilities in the province.

ADVANCED METERING INFRASTRUCTURE AND SOLUTION ARCHITECTURE

Advanced metering infrastructure (AMI) is the term used to describe all of the hardware, software and connectivity required for a fully functioning smart metering system. To view AMI as just a technology to remotely read meters and bill customers, however, would be to miss the full potential of smart metering.

The core of the solution resides with the requirement for a ubiquitous communications network and an integration approach that provides for the exploitation of data from many types of devices (automated meter reading, load control, in-home displays, distribution monitoring and control and so on) by making it available to numerous enterprise applications (for example, customer information, outage management, asset management, geographic information and work execution systems).

To meet this requirement, the Hydro One team architected an end-to-end solution that rigorously sought open standards and the use of IP at all communications levels to ensure that the network and integration would be available to and compatible with numerous applications.

Hydro One’s AMI solution is based on standards (ANSI and IEEE) and open protocols (Zigbee and IP) to ensure maximum flexibility into the future as the technology and underlying applications such as in-home energy conservation devices (two-way real-time monitors, pool pump timers and so on) and various utility applications evolve.

Smart Meters

The “smarts” in any smart meter can be housed in virtually any meter platform. Meter reads are communicated at a frequency of 2.4 GHz by a radio housed under the meter’s glass. In essence, the hourly meter reads are transmitted by hopping from one meter to the next, forming a self-organizing network that culminates at the advanced meter regional collector (AMRC). This type of local area network, or LAN, is known as a mesh network and is known for its self-healing characteristics: if communication between meters is interrupted for any reason, communication paths between meters are automatically rerouted to the regional collector to ensure that data is delivered reliably and on time. The installed smart meters also have a “super capacitor,” enabling the meter to send a last communication to the utility when there has been a power outage.
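The self-healing behavior described above amounts to recomputing a shortest hop path when a link drops. The toy sketch below shows the idea with a breadth-first search over a five-node topology; the node names and graph are invented for illustration and say nothing about the routing protocol an actual mesh vendor implements.

```python
from collections import deque

def shortest_path(links, src, dst):
    """BFS over an undirected meter-to-meter adjacency map; returns hop list."""
    seen, queue = {src: None}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = seen[node]
            return path[::-1]
        for nxt in links.get(node, ()):
            if nxt not in seen:
                seen[nxt] = node
                queue.append(nxt)
    return None

# Toy topology: meters A-D hop toward the regional collector (AMRC).
links = {"A": {"B", "C"}, "B": {"A", "AMRC"}, "C": {"A", "D"},
         "D": {"C", "AMRC"}, "AMRC": {"B", "D"}}
print(shortest_path(links, "A", "AMRC"))  # ['A', 'B', 'AMRC']

# Simulate a link failure between B and the collector; traffic reroutes.
links["B"].discard("AMRC"); links["AMRC"].discard("B")
print(shortest_path(links, "A", "AMRC"))  # ['A', 'C', 'D', 'AMRC']
```

When the B-to-collector hop fails, meter A's data still arrives via C and D, which is the essence of the mesh's self-healing characteristic.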

Repeaters

Repeaters provide a wireless range extender for the meters and are used in less densely populated areas in the province to allow data to be transmitted from one meter to the next. Typically, repeaters are needed if the hop between meters is greater than 1 to 2 kilometers (depending on a number of factors, including terrain and ground cover).

Advanced Metering Regional Collectors

Typically installed on poles at preselected locations within a local area network, advanced metering regional collectors (AMRCs) gather the meter readings in a defined area. Most importantly, the AMRCs provide access to the wide area network (WAN), where data is sent wirelessly back to Hydro One. The AMI solution is designed to accommodate either wireless cellular or broadband WAN to backhaul hourly meter reads to the advanced metering control computer.

Advanced Metering Control Computer

The advanced metering control computer (AMCC) is used to retrieve and temporarily store meter reads from the regional collectors before they’re transmitted to the meter data management repository (discussed below). The information stored in the AMCC is available to log maintenance and data transmission faults, and to issue reports on the overall health of the AMI system.

Meter Data Management Repository

MDM/R is the acronym for the province-wide meter data management repository. The MDM/R provides a common infrastructure for receiving meter reads from all LDCs in Ontario, processing the reads to produce billing quality consumption data, and storing and managing the data. The Ontario government has entered into an agreement with the Independent Electricity System Operator to coordinate and manage implementation activities associated with the MDM/R.

Billing

Time-of-use “bucketed” data is sent from the MDM/R to Hydro One for any exception handling that may be required and for customer billing. Hydro One prepares the bill and sends it to the customer for payment.

Web Presentment of Customer Usage Data

Customer electricity usage data will be available to customers by 9 a.m. the day after they use it via a secure website. This data will be clearly marked as preliminary data until the customer has been billed.

GOALS, OBJECTIVES AND KEY ACCOMPLISHMENTS

To successfully deploy the smart metering solution described above, the Hydro One team set out to accomplish the following goals and objectives (which are enshrined in project governance plans and daily project activities):

  • Balance investment with the regulatory process to ensure that smart meter investments don’t get ahead of changes in regulatory requirements.
  • Design, test, prototype and pilot prior to buying or building – a rule that applies to all aspects of the smart meter solution architecture, from the meters and communication network to the back-office systems.
  • Delay building solution components until line-of-business requirements are locked down. Solution components that are unlikely to change will be built before other components to minimize the risk of rework.
  • Test smart meter deployment business processes, technology and customer experience throughout the process.
  • Ensure positive customer experience and value, including providing customers with information and tools to leverage smart meters in an appropriate time frame.
  • Use commercial, off-the-shelf (COTS) products where possible (as opposed to custom solutions).
  • Include estimation of total cost of ownership (one-time and ongoing costs) in architectural decision making.
  • Enable commencement of time-of-use (TOU) billing in 2009.

Key project accomplishments to date have included:

  • Building an in-situ lab using WiMax and meters in rural areas to test and confirm open protocols, wireless broadband interoperability, and meter performance;
  • Conducting a community rollout of about 15,000 meters to develop and successfully test and optimize meter change automation tools and customer communication processes;
  • Mass deployment of just over a quarter of a million meters across the province;
  • Designing and beginning to build the communication network to support the collection of hourly reads from approximately 1.3 million customers.

METER AND NETWORK DEPLOYMENT

Meter installation teams surpassed a notable milestone of 250,000 installed smart meters as of December 2007. Network deployment began in 2007 with a planned ramp-up in 2008 of installing more than 2,000 AMRCs province-wide.

Meeting these targets has required well-coordinated activities across the project team while working in parallel with external entities such as Measurement Canada and others to ensure compliance with regulatory requirements.

Throughout meter and network deployment activities, Hydro One has adhered closely to three primary guiding principles, namely:

Safety. The following initiatives were factored into the project to help maintain a safe environment for all employees and business partners:

  • Internal training was integrated into the project from its inception, establishing a thorough yet practical safety culture throughout the team.
  • No employee is permitted to work on the project without a full safety refresher.
  • Safety represented a key element of incentive compensation for management and executive personnel.

Customer service. Given the opportunity to visit literally every customer, the success of this project is being judged daily by the manner in which the project team interacts with customers.

  • Every customer is provided with an information package within 15 to 30 days of the meter change.
  • Billing windows are scrupulously avoided through automation tools and integration with CIS in order to eliminate any disruption to the size, look and feel of the customer bill.
  • All customers receive a personal knock at the door before meter change.
  • All life-safety customers have their meters changed by appointment, or positive contact is made prior to the meter change if they cannot be reached for an appointment.

Productivity. Despite Hydro One’s rural footprint – which includes some areas so remote they must be accessed by all-terrain vehicle, boat or snowmobile – the installation teams maintain an average of 39.6 meters per installer-day with a peak of 97 per installer-day. They have achieved this through automation and a phased ramp-up of installers, including training and joint fieldwork with Hydro One’s partners.
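The productivity figures above translate directly into deployment-effort estimates. The sketch below uses the article's own numbers (1.3 million meters, 39.6 meters per installer-day); the installer headcount and working days per year are assumptions for illustration, not Hydro One's actual staffing.

```python
# Back-of-the-envelope deployment math from the figures in the text.
total_meters = 1_300_000
avg_per_installer_day = 39.6

installer_days = total_meters / avg_per_installer_day
print(round(installer_days))  # 32828 installer-days of work

# Assumed staffing level and work calendar (hypothetical values).
installers, work_days_per_year = 120, 230
years = installer_days / (installers * work_days_per_year)
print(round(years, 1))  # ~1.2 years at that assumed staffing
```

Even modest gains in the per-installer-day average compound across tens of thousands of installer-days, which is why the automation and phased ramp-up mattered.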

IN-HOME CONSERVATION AND DEMAND MANAGEMENT

Testing will soon be underway using third-party devices for residential demand response programs that operate on the mesh network, including two-way real-time monitors, automated thermostats and load control devices. Optimally for customers, meters will serve as the key head-end device, connectable to numerous other devices within the home as illustrated in Figure 2.

While much of this technology is still in its infancy, North America-wide AMI deployments will rapidly accelerate, resulting in greatly enhanced customer service opportunities going forward.

LEVERAGING THE SMART NETWORK TO INCREASE UTILITY EFFICIENCY

Hydro One is also looking ahead to applications that will leverage the smart metering communication network to increase the efficiency of its operations. As illustrated in Figure 3, these applications include distribution station monitoring, enhancements to outage management, safety monitoring, mobile work dispatch and work accomplishment, and asset security. All of the above applications have been tested in a proof-of-concept environment, and individual projects are planned to proceed on a business case basis.

Opportunity Ahead: The Aging Workforce

Conventional thinking has it that the utility industry’s aging workforce represents a critical problem demanding a call to arms. But is an aging workforce really just a human resources dilemma? Or can it be viewed more broadly as a window through which utilities can examine ways to foster positive change for the future of their organizations? When viewed in this light, the exit of a large cohort of skilled workers may represent the most significant opportunity a utility will ever confront – one that could fundamentally alter the way it does business and upgrade its financial performance.

At most utilities, little or no opportunity exists for significant revenue growth (a situation that’s persisted for some time) at the same time that personnel-related expenses have continued to increase and squeeze profit margins. To achieve the annual earnings improvement targets of 10 to 15 percent that stakeholders have come to expect, utilities have had no alternative but to reduce ongoing operational expenses dramatically – and often that’s meant cutting staff.

But the days of dramatic expense cuts based on typical cost reduction strategies are all but over. With nearly a third of the industry eligible to retire today, further personnel cuts aren’t warranted. Utilities are now confronted with a unique opportunity to make business improvements to reduce future costs. One approach involves using innovative technology to:

  • Lessen headcount requirements and make better use of reduced staffs;
  • Capture the knowledge base of skilled workers before they depart the workforce;
  • Reduce the number of people required to carry out a task by improving data access and communications among operating units;
  • Emphasize availability and use of key skills (rather than number of personnel);
  • Create true “best practices” (rather than continue to rely on “status quo practices”); and
  • Develop a “digital organization” that excites and retains new hires.

The utilities that will be successful in the future – the high-performance utilities – won’t hire their way to success. After all, there will be fewer skilled workers available for hire; recruitment will remain costly; and ongoing personnel-related expenses will continue to escalate. Instead, the high-performance utility will institutionalize its key procedures and business processes (by capturing existing employee knowledge) and exploit documented best practices before employees fly out the door.

Forward-looking utilities must invest in strategic technology, using a variety of partner models to meet their requirements. Technology solutions that solve localized issues will not address the future. Solutions that are able to look at a utility horizontally – as an organization with many parts that need to perform as a single entity – will serve as an important means of dealing with the disappearing workforce.

WHAT ARE UTILITIES LOSING … AND GAINING?

The imminent loss of critical skills and knowledge base caused by an aging workforce approaching retirement represents a demographic tsunami – a force unprecedented in business history. During the next five to 10 years, many utilities will lose as much as 50 percent of their current workforce to retirement. Clerical and administrative staff, as well as field technicians, managers and supervisors, engineers, IT personnel and business executives will all be part of the retirement wave.

The effect of utility workforce retirement is more profound than simple personnel turnover, because it represents a loss of critical knowledge. This knowledge base embodies the art of the organization – not just the information documented in manuals, maps, procedures and databases but also the organization’s culture and attitudes.

As younger workers replace an aging and departing workforce, utilities could witness the fracture of the motivational belief system that once bound the workforce. To meet the utility’s objectives, new workers need to have access to the expertise and knowledge of prior generations of workers. They can then build on this knowledge with their own experiences, helping the utility achieve a new and positive culture for success.

CONVENTIONAL SOLUTIONS

Industry literature suggests a number of solutions to the aging utility workforce problem:

  • Long-term staffing plans;
  • Partnerships with universities and community colleges;
  • Continuing education and training programs;
  • Active involvement in industry organizations; and
  • Internal knowledge sharing programs.

Each of these approaches plays a role in the solution, but collectively they still fall short of truly lessening the impact of the loss of half (or more) of a utility’s workforce. To wit: the number of students enrolled in college math and science programs (with the exception of computer and information science) continues to decline. And in the last 15 years, colleges and universities have seen a 50 percent decline in the number of graduating engineers (one of many skill sets a utility requires). All of which means that as utilities lose their skilled workers, they will not be able to replace those skills by drawing from the current labor pool. Solutions other than hiring programs will be needed to bridge the gap between skills lost and skills needed.

THE ROLE OF TECHNOLOGY

Much of the technology utilities have implemented over the past five to 10 years has taken the form of “point” software solutions. By solving specific and limited problems, this software has tended to reinforce status quo business practices rather than enable innovation or better problem solving.

In many utilities, status quo means a vertical organization – a group of departmental silos that define the utility’s corporate structure. In a vertical structure, each group or department operates as a somewhat isolated entity, and each group “owns” the work to which it is assigned. But the manner in which utilities conduct business is comprised of horizontal processes spanning the office and the field – processes that are driven by the customer, whether commercial, industrial or residential.

Thus, vertical organizations often inhibit the type of change that can reduce headcount requirements and ensure better communication between remaining personnel. But changes that help flatten an organization horizontally – so that operations and procedures are viewed from end to end – can streamline business processes to improve handoffs between job roles and eliminate time-consuming and labor-intensive administration steps.

In the future, high-performance utilities will of necessity implement horizontal business process solutions that involve multiple systems spanning former organizational silos such as customer service and distribution operations. Horizontal solutions represent a quantum change in project complexity that will stretch many utilities’ internal organizations and define the systems integration market in the future.

The major opportunity offered by an integrated, horizontal solution lies in the creation of a strategic technology platform that offers the benefits of positive change and value creation. Such changes will be critical in supporting a utility as it undergoes workforce attrition and cultural evolution due to workforce retirements. The following represent some of the opportunities for change that high-performance utilities should be reviewing.

Business Process Change Opportunities

The term best practices has sometimes been defined as a generic methodology or a detailed scripting of events rather than an organized, documented view of the preferred and streamlined way to carry out a particular procedure. Many major technology initiatives and systems implementations have failed to deliver value to the utility because the true “best” practice is never defined, and therefore the transformation of the business process never occurs. The pressure to reduce costs and the rush to adopt scripts of existing procedures are the primary reasons for this disappointment.

The high-performance utility of the future, then, must commit to accurately defined best practices and a program of continuous process improvement. Such programs reduce costs by simplifying and standardizing business processes, eliminating paperwork and redundant data, reducing personnel interface points and viewing a utility’s operations from office to field as a single continuum. A strong strategic technology platform can support the capture and reinforcement of these standards.

Design Engineering Opportunities

The average investor-owned utility in North America has more than 50 design engineers architecting construction work undertaken by the utility. The design of such work involves significant systems support, including a geographic information system (GIS) and a graphical work design interface that links the GIS to a work management system.

Much of the construction work and underlying design work undertaken by utilities is repetitive. This type of repetitive work – particularly for light or medium construction activities – lends itself to design templates. In fact, design templates could accommodate as much as 80 percent of the design engineering workload. The development of a best practice based on standard designs for discrete types of work (and institutionalizing a standard design as a replicable template for the engineering department) can reduce a utility’s dependence on an increasingly limited supply of talented engineering labor.

Scheduling and Dispatching Opportunities

The average investor-owned utility (IOU) in North America has more than 700 field crews serving trouble response, customer service, maintenance and construction activities. Although job function definitions and responsibilities vary among utilities, the roles that manage the deployment of field crews may be defined as 1) schedulers; 2) dispatchers; 3) administrative personnel; and 4) field supervisors. All of these individuals may actively schedule or dispatch the field workforce, even within the same utility.

The same average IOU also has as many as 60 full-time employees (approximately one for every 12 field crews) involved in scheduling, dispatching, monitoring and providing administrative support to the field workforce. The staff handling these tasks is often functionally, organizationally and geographically dispersed – thanks largely to the point software mobile applications that mirror the organizational silos that acquired the applications. Typically, each piece of software addresses one job type: emergencies, customer service, maintenance or construction. Accordingly, each department employs multiple staff to schedule and/or dispatch each type of job.

This kind of environment spells opportunity for utilities facing shrinking workforces, since a single scheduling and dispatching technology can have immense cost-reduction implications (including reducing redundant job roles).

The scheduling of field personnel can also be worked into a single dispatch strategy. Utilities need a unified method of work allocation – a kind of utility command and control center for scheduling and dispatching all work. The right strategic technology platform incorporates significant business intelligence, understands job dependencies, employs least-cost routing and continually provides the user with an optimized schedule throughout the workday. As the scheduling software assumes more of the scheduling responsibility, the 60 full-time employees formerly required by an average utility become unnecessary, thereby eliminating a major staffing concern.
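One simple form of the least-cost routing mentioned above is a greedy nearest-neighbor pass over outstanding jobs. The sketch below is a deliberately minimal illustration with invented job names and coordinates; production dispatch engines also weigh priorities, crew skills, job dependencies and time windows.

```python
import math

def route(start, jobs):
    """Greedy least-cost routing for one crew: always visit the nearest
    unserved job next. jobs: dict of job_id -> (x, y). Returns visit order."""
    pos, remaining, order = start, dict(jobs), []
    while remaining:
        jid = min(remaining, key=lambda j: math.dist(pos, remaining[j]))
        order.append(jid)
        pos = remaining.pop(jid)
    return order

# Hypothetical work queue mixing job types.
jobs = {"trouble": (1, 1), "service": (5, 0), "maint": (2, 2)}
print(route((0, 0), jobs))  # ['trouble', 'maint', 'service']
```

A single engine of this kind, extended with business intelligence about job dependencies and continually re-optimized during the workday, is what allows one command-and-control center to replace dispatchers scattered across departmental silos.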

Wireless Opportunities

For the last two years in North America, utilities have issued more RFPs for mobile workforce management than any other application domain. All of the top 100 North American IOUs employ some form of mobile deployment. However, these applications are point software solutions that address one job type, such as trouble reporting; they do not currently support a horizontal dispatching and scheduling function. Furthermore, many utilities lack an overarching, dedicated wireless strategy to fully mobilize the workforce.

Utilities require a plug-and-play wireless communications architecture that 1) manages the flow of data between office and field; 2) maximizes the bandwidth and throughput of existing utility RF radio, wire line and wireless networks; 3) assigns priorities to time-sensitive data; and 4) provides least-cost routing (network choice). This represents a complex undertaking – and one that no utility has yet mastered. There is no generic plug-and-play platform that manages field workforces in this way. Indeed, a universal communications platform (dispatch) that manages all types of work has been the holy grail of the network connectivity business. No utility has this capability today.

Once it is achieved, however, a universal architecture will allow the utility to plug-and-play back-office and mobile applications to broaden the footprint of work conducted wirelessly in the field. A universal mobile application controller that manages all types of work will power the future of mobile computing for the industry. In addition to application and network independence, the utility’s wireless enterprise strategy must accommodate the management of multiple field devices, and the supporting server and communications hardware/middleware environment.

An integrated universal communications platform must be viewed as the next technology that will enable utilities to lessen their dependence on headcount. The technologies that support such a platform are being created now; in order to blunt the impact of a disappearing workforce, high-performance utilities need to begin partnering with systems integrators that can bring these technologies to the table.

THE FUTURE OF TECHNOLOGY: SOLUTION OPTIMIZATION

The next significant strategic technologies implemented by utilities will be those that optimize solutions and processes. These systems will help the utility institutionalize the knowledge of seasoned employees and incorporate that knowledge into documented, sustainable best practices. In addition, new strategic technologies will help the utility evolve best practices over time through a program of continuous process improvement. Furthermore, these new technologies will provide the utility with ways to most effectively use both new and existing applications to perform work across the entire horizontal utility organization.

Instead of tactically buying enabling technology such as software, utilities will strategically partner with organizations that can deliver technology that creates value within the utility. Utilities will increasingly seek partners who own the business result, not simply the process or the IT infrastructure. Such partners will share utility risk and reward in a program of continuous process improvement, as they and the utility constantly refine and optimize solutions.

CONCLUSION

What will the high-performance utility look like in 10 years? For starters, it will have fewer employees and more new faces. It will have lost much of the culture it relied on to drive its business forward. But if it makes the right plans today, it will ultimately gain a new culture that combines the best of the old knowledge with the advantages of a new strategic technology platform. The new platform will unite all segments of utility operations within a single set of business goals. A workforce disappearing to retirement need not spell disaster if a utility takes steps now: applying conventional hiring approaches, embracing new technology and seeking out vendor partnerships to help unite and optimize the utility’s work processes.

Policy and Regulatory Initiatives And the Smart Grid

Public policy is commonly defined as a plan of action designed to guide decisions for achieving a targeted outcome. In the case of smart grids, new policies are needed if smart grids are actually to become a reality. This statement may sound dire, given the recent signing into law of the 2007 Energy Independence and Security Act (EISA) in the United States. And in fact, work is underway in several countries to encourage smart grids and smart grid components such as smart metering. However, the risk still exists that unless stronger policies are enacted, grid modernization investments will fail to leverage the newer and better technologies now emerging, and smart grid efforts will never move beyond demonstration projects. This would be an unfortunate result when you consider the many benefits of a true smart grid: cost savings for the utility, reduced bills for customers, improved reliability and better environmental stewardship.

REGIONAL AND NATIONAL EFFORTS

As mentioned above, several regions are experimenting with smart grid provisions. At the national level, the U.S. federal government has enacted two pieces of legislation that support advanced metering and smart grids. The Energy Policy Act of 2005 directed U.S. utility regulators to consider time-of-use meters for their states. The 2007 EISA legislation has several provisions, including a list of smart grid goals to encourage two-way, real-time digital networks that stretch from a consumer’s home to the distribution network. The law also provides monies for regional demonstration projects and matching grants for smart grid investments. The EISA legislation also mandates the development of an “interoperability framework.”

In Europe, the European Union (E.U.) introduced a strategic energy technology plan in 2006 for the development of a smart electricity system over the next 30 years. The European Technology Platform organization includes representatives from industry, transmission and distribution system operators, research bodies and regulators. The organization has identified objectives and proposes a strategy to make the smart grid vision a reality.

Regionally, several U.S. states and Canadian provinces are focused on smart grid investments. In Canada, the Ontario Energy Board has mandated smart meters, with meter installation completion anticipated by 2010. In Texas, the Public Utilities Commission of Texas (PUCT) has finalized advanced metering legislation that authorizes metering cost recovery through surcharges. The PUCT also stipulated key components of an advanced metering system: two-way communications, time-date stamp, remote connect/disconnect, and access to consumer usage for both the consumer and the retail energy provider. The Massachusetts State Senate approved an energy bill that includes smart grid and time-of-use pricing. The bill requires that utilities submit a plan by Sept. 1, 2008, to the Massachusetts Public Utilities Commission, establishing a six-month pilot program for a smart grid. Most recently, California, Washington state and Maryland all introduced smart grid legislation.

AN ENCOMPASSING VISION

While these national and regional examples represent just a portion of the ongoing activity in this area, the issue remains that smart grid and advanced metering pilot programs do not guarantee a truly integrated, interoperable, scalable smart grid. Granted, a smart grid is not achieved overnight, but an encompassing smart grid vision should be in place as modernization and metering decisions are made, so that each investment remains consistent with that vision. Obviously, challenges – such as financing, system integration and customer education – exist in moving from pilot to full grid deployment. However, many utility and regulatory personnel perceive these challenges chiefly as matters of cost and technology readiness.

The costs are considerable. KEMA, the global energy consulting firm, estimates that the average cost of a smart meter project (representing just a portion of a smart grid project) is $775 million. The E.U.’s Strategic Energy Technology Plan estimates that the total smart grid investment required could be as much as $750 billion. These amounts are staggering when you consider that the market capitalization of all U.S. investor-owned electric utilities is roughly $550 billion. However, they’re not nearly as significant when you subtract the costs of fixing the grid using business-as-usual methods. Transmission and distribution expenditures are occurring with and without intelligence. The Energy Information Administration (EIA) estimates that between now and 2020 more than $200 billion will be spent to maintain and expand electricity transmission and distribution infrastructures in the United States alone.

Technology readiness will always be a concern in large system projects. Advances are being made in communication, sensor and security technologies, and IT. The Federal Communications Commission is pushing for auctions to accelerate adoption of different communication protocols. Price points are decreasing for pervasive cellular communication networks. Electric power equipment manufacturers are utilizing the new IEC 61850 standard to ensure interoperability among sensor devices. Vendors are developing security products that will enable North American Electric Reliability Corp. (NERC) and Critical Infrastructure Protection (CIP) compliance.

In addition, IT providers are using event-driven architecture to respond to external events as they occur, rather than processing them in transactional batches, and are reaching new levels of performance with high-speed computer analytics. Leading service-oriented architecture companies are working with utilities to establish the underlying infrastructure critical to system integration. Finally, work is occurring in the standards community by the E.U., the GridWise Architecture Council (GAC), IntelliGrid, the National Energy Technology Laboratory (NETL) and others to create frameworks for linking communication and electricity interoperability among devices, systems and data flows.
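The event-driven pattern described here can be reduced to a very small publish/subscribe core: handlers react to grid events the moment they arrive instead of waiting for a transactional batch run. The event names and handlers below are hypothetical, chosen only to show the shape of the pattern.

```python
from collections import defaultdict

# Registry mapping event types to the handlers subscribed to them.
handlers = defaultdict(list)

def subscribe(event_type, handler):
    """Register a handler to be called whenever event_type is published."""
    handlers[event_type].append(handler)

def publish(event_type, data):
    """Deliver an event to every subscribed handler as it occurs."""
    for handler in handlers[event_type]:
        handler(data)

log = []
# Two independent reactions to the same hypothetical grid event.
subscribe("voltage_sag", lambda d: log.append(f"reconfigure feeder {d['feeder']}"))
subscribe("voltage_sag", lambda d: log.append(f"notify operator of sag at {d['feeder']}"))

publish("voltage_sag", {"feeder": "F12"})
print(log)
```

Production event-driven systems add durable queues, ordering guarantees and error handling, but the reactive principle is the same.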

THE TIME IS NOW

These challenges should not halt progress – especially when one considers the societal benefits. Time stops for no one, and nowhere is that more true than in the energy sector. Energy demand is increasing: the Energy Information Administration estimates that annual energy demand will increase roughly 50 percent over the next 25 years. Meanwhile, the debate over global warming seems to have waned; few authorities now dispute the escalating concentrations of several greenhouse gases due to the burning of fossil fuels. The E.U. is attempting to decrease emissions through its 2006 Energy Efficiency Directive, and many industry observers in the United States believe that federal regulation of greenhouse gases is likely within the next three years.

A smart grid would address many of these issues by giving consumers options to manage their usage and costs. By optimizing asset utilization, the smart grid reduces the need to build new power plants to meet rising electricity demand. And as a self-healing grid that detects, responds and restores functions, it can greatly reduce the economic impact of blackouts and power interruptions.

A smart grid that provides the needed power quality can ensure the strong and resilient energy infrastructure necessary for the 21st-century economy. Lastly, a smart grid will enable plug-and-play integration of renewables, distributed resources and control systems.

INCENTIVES FOR MODERNIZATION

Despite all of these potential benefits, more incentives are needed to drive grid modernization efforts. Several mechanisms are available to encourage investment. Some utilities are already using or evaluating alternative rate structures, such as net metering and revenue decoupling, that give utilities and consumers incentives to use less energy. Net metering awards energy incentives or credits for consumer-based renewables, and revenue decoupling is a mechanism designed to eliminate or reduce the dependence of a utility’s revenues on sales volume. Other programs – such as energy-efficiency or demand-reduction incentives – motivate consumers and businesses to adopt long-term energy-efficient behaviors (such as using programmable thermostats) and to consider energy efficiency when using appliances and computers, and even when operating their homes.
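To make the net metering mechanism concrete, here is a back-of-the-envelope calculation; the rate and monthly volumes are invented purely for illustration and vary widely by jurisdiction.

```python
# Hypothetical rate and monthly volumes -- illustrative only.
retail_rate = 0.12      # $ per kWh charged to the consumer
consumed_kwh = 900      # energy drawn from the grid over the month
generated_kwh = 250     # energy exported from consumer-based renewables

# Under simple net metering, exported energy is credited at the retail
# rate, so the bill reflects net consumption only.
net_kwh = consumed_kwh - generated_kwh
bill = net_kwh * retail_rate
print(f"Net metered bill: ${bill:.2f}")
# → Net metered bill: $78.00 (versus $108.00 with no credit)
```

Actual programs differ in whether exports are credited at retail or wholesale rates and in how unused credits roll over, which is exactly where policy design matters.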

Policy and regulatory strategy should incorporate these means and others, such as accelerated depreciation and tax incentives. Accelerated depreciation encourages businesses to purchase new assets, since depreciation is steeper in the early years of an asset’s life and taxes are deferred to a later period. Tax incentives could be put in place for purchasing smart grid components. Utility commissions could require utilities to consider all societal benefits, rather than just rate impacts, when crafting the business case. Utilities could take federal income tax credits for the investments. Leaders could make smart grid technologies a critical component of their overall energy policy.

Only when all of these policies and incentives are put in place will smart grids truly become a reality.