Successful Smart Grid Architecture

The smart grid is progressing well on several fronts. Groups such as the GridWise Alliance, events such as GridWeek, and national policy initiatives such as the American Recovery and Reinvestment Act in the U.S. have all brought positive attention to this opportunity. The boom in distributed renewable energy and its demand for a bidirectional grid is driving the need forward, as is the push to improve consumer control and awareness, giving customers the ability to engage in real-time energy conservation.

On the technology front, advances in wireless and other data communications make wide-area sensor networks more feasible. Distributed computation is certainly more powerful – just consider your iPod! Even architectural issues such as interoperability are now being addressed in their own forums, such as Grid-Interop. It seems the recipe for a smart grid is coming together in a way that would make many who envisioned it proud. But to avoid making a gooey mess in the oven, an overall architecture that carefully considers seven key ingredients for success must first exist.

Sources of Data

Utilities have eons of operational data: both real-time and archival, both static (such as nodal diagrams within distribution management systems) and dynamic (such as switching orders). There is a wealth of information generated by field crews and from root-cause analyses of past system failures. Advanced metering infrastructure (AMI) implementations become a fine-grained distribution sensor network feeding communication aggregation systems such as Silver Spring Networks' UtilityIQ or Trilliant's Secure Mesh Network.

These data sources need to be architected to be available to enhance, support and provide context for real-time data coming in from new intelligent electronic devices (IEDs) and other smart grid devices. In an era of renewable energy sources, grid connection controllers become yet another data source. With renewables, micro-scale weather forecasting such as IBM Research’s Deep Thunder can provide valuable context for grid operation.

Data Models

Once data is obtained, preserving its value in a standard format suggests thinking in terms of an extensible markup language (XML)-oriented database. Modern implementations of these databases have improved performance characteristics, and the International Electrotechnical Commission (IEC) common information model/generic interface definition (CIM/GID), though oriented more to assets than operations, is a front-running candidate for consideration.

Newer entries, such as the device language message specification/companion specification for energy metering (DLMS/COSEM) for AMI, are also coming into practice. Sometimes more important than the technical implementation of the data, however, is the model that is employed. A well-designed data model not only makes data exchange and legacy program adjustments easier, but also helps in applying security and performance requirements. The existence of data models is often a good indicator of an intact governance process, for it facilitates use of the data by multiple applications.
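To make the value of a shared data model concrete, here is a minimal Python sketch of reading asset attributes from a CIM-style XML payload. The element and attribute names are illustrative stand-ins, not taken from the actual IEC CIM schema; the point is that any application agreeing on the model can consume the same document.

```python
import xml.etree.ElementTree as ET

# Hypothetical CIM-style payload; element and attribute names are
# illustrative, not drawn from the actual IEC CIM schema.
payload = """
<Substation name="Walden">
  <PowerTransformer mRID="TX-1001" ratedMVA="25.0"/>
  <PowerTransformer mRID="TX-1002" ratedMVA="40.0"/>
</Substation>
"""

def transformers(xml_text):
    """Return (mRID, ratedMVA) pairs from a CIM-style document."""
    root = ET.fromstring(xml_text)
    return [(t.get("mRID"), float(t.get("ratedMVA")))
            for t in root.iter("PowerTransformer")]

print(transformers(payload))
```

Because the model, not the consuming application, defines the element names, a second application (say, an asset-health analytic) can reuse `transformers()` unchanged.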

Communications

Customer workshops and blueprinting sessions have shown that one of the most common issues needing to be addressed is the design of the wide-area communication system. Data communications architecture affects data rate performance, the cost of distributed intelligence and the identification of security vulnerabilities.

There is no single communications technology that is suitable for all utilities, or even for all operational areas across any individual utility. Rural areas may be served by broadband over powerline (BPL), while urban areas benefit from multi-protocol label switching (MPLS) and purpose-designed mesh networks, enhanced by their proximity to fiber.

In the future, there could be entirely new choices in communications. So, the smart grid architect needs to focus on security, standardized interfaces to accept new technology, enablement of remote configuration of devices to minimize any touching of smart grid devices once installed, and future-proofing the protocols.

The architecture should also be traceable to the business case. This needs to include probable use cases that may not be in the PUC filing, such as deploying AMI now but evolving to a full smart grid later. Few utilities will be pleased with the idea of a communication network rebuild within five years of deploying an AMI-only network.

Communications architecture must also consider power outages, so battery backup, solar recharging, or other equipment may be required. Even arcane details such as “Will the antenna on a wireless device be the first thing to blow off in a hurricane?” need to be considered.

Security

Certainly, the smart grid’s purpose is to enhance network reliability, not lower its security. But with the advent of the North American Electric Reliability Corporation’s Critical Infrastructure Protection standards (NERC CIP), security has risen to become a prime consideration, usually addressed in phase one of the smart grid architecture.

Unlike the data center, field-deployed security has many new situations and challenges. There is security at the substation – for example, who can access what networks, and when, within the control center. At the other end, security of the meter data in a proprietary AMI system needs to be addressed so that only authorized applications and personnel can access the data.

Service oriented architecture (SOA) appliances are network devices to enable integration and help provide security at the Web services message level. These typically include an integration device, which streamlines SOA infrastructures; an XML accelerator, which offloads XML processing; and an XML security gateway, which helps provide message-level, Web-services security. A security gateway helps to ensure that only authorized applications are allowed to access the data, whether an IP meter or an IED. SOA appliance security features complement the SOA security management capabilities of software.
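As a rough illustration of message-level gatekeeping, the following Python sketch models a gateway that rejects requests from unauthorized applications or with malformed payloads before they reach back-end services. The application IDs and checks are hypothetical; a real XML security gateway would go much further, validating schemas, signatures and encryption.

```python
import xml.etree.ElementTree as ET

# Hypothetical registry of applications allowed to query meter data.
AUTHORIZED_APPS = {"mdms", "outage-mgmt"}

def gateway_accepts(message: str, app_id: str) -> bool:
    """Reject messages from unauthorized applications or with
    malformed XML before they reach back-end services."""
    if app_id not in AUTHORIZED_APPS:
        return False
    try:
        ET.fromstring(message)  # a real gateway would also validate schema
    except ET.ParseError:
        return False
    return True

print(gateway_accepts("<reading kwh='1.2'/>", "mdms"))     # accepted
print(gateway_accepts("<reading kwh='1.2'", "mdms"))       # malformed XML
print(gateway_accepts("<reading kwh='1.2'/>", "unknown"))  # unknown app
```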

A proper architecture could support dynamic, trusted virtual security domains, combined not only with intrusion protection systems but also with anomaly detection systems. If hackers can introduce viruses in data (such as malformed video images that exploit faults in media players), then similar concerns should be under discussion for smart grid data. Is messing with 300 megawatts (MW) of demand response much different from a cyber attack on a 300 MW generator?

Analytics

A smart grid cynic might say, “Who is going to look at all of this new data?” That is where analytics supports the processing, interpretation and correlation of the flood of new grid observations. One part of the analytics would be performed by existing applications; this is where data models and integration play a key role. Another part lies in new applications and the ability of engineers to use a workbench to create their own customized analytics dashboards in a self-service model.

Many utilities have power system engineers in a back office using spreadsheets; part of the smart grid concept is that all data is available to the community to use modern tools to analyze and predict grid operation. Analytics may need a dedicated data bus, separate from an enterprise service bus (ESB) or enterprise SOA bus, to meet the timeliness and quality of service to support operational analytics.

A two-tier or three-tier (if one considers the substations) bus is an architectural approach to segregate data by speed while maintaining interconnections that support a holistic view of the operation. Connections to standard industry tools such as ABB’s NEPLAN® or Siemens Power Technologies International’s PSS®E, or general tools such as MATLAB, should be considered at design time, rather than as an additional expense commitment after smart grid commissioning.
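As a simple example of the kind of operational analytic an engineer might assemble in such a workbench, this Python sketch flags interval load readings that deviate sharply from a trailing window of recent history. The window size and deviation threshold are illustrative assumptions, not recommended settings.

```python
from statistics import mean, stdev

def flag_anomalies(loads_mw, window=4, threshold=3.0):
    """Flag interval readings deviating more than `threshold`
    standard deviations from the trailing window mean."""
    flags = []
    for i in range(window, len(loads_mw)):
        hist = loads_mw[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        flags.append(sigma > 0 and abs(loads_mw[i] - mu) > threshold * sigma)
    return flags

loads = [100.0, 101.0, 99.0, 100.0, 250.0]  # final reading spikes
print(flag_anomalies(loads))
```

In production, an analytic like this would subscribe to the operational data bus rather than a Python list, but the segregation-by-speed argument above is exactly about guaranteeing such a consumer timely data.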

Integration

Once data is sensed, securely communicated, modeled and analyzed, the results need to be applied for business optimization. This means new smart grid data gets integrated with existing applications, and metadata locked in legacy systems is made available to provide meaningful context.

This is typically accomplished by enabling systems as services per the classic SOA model. However, issues of common data formats, data integrity and name services must be considered. Data integrity includes verification and cross-correlation of information for validity, and designation of authoritative sources and specific personnel who own the data.

Name services addresses the common issue of an asset – whether transformer or truck – having multiple names in multiple systems. An example might be a substation that has a location name, such as Walden; a geographic information system (GIS) identifier such as latitude and longitude; a map name such as nearest cross streets; a capital asset number in the financial system; a logical name in the distribution system topology; an abbreviated logical name to fit in the distribution management system graphical user interface (DMS GUI); and an IP address for the main network router in the substation.

Different applications may know new data by association with one of those names, and that name may need translation to be used in a query with another application. While rewriting the applications to a common model may seem appealing, it may very well send a CIO into shock. While the smart grid should help propagate intelligence throughout the utility, this doesn’t necessarily mean replacing everything; rather, it should “information-enable” everything.
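A name service of the sort described can be sketched as a registry that maps each asset's identifiers across systems, so a query expressed in one system's namespace can be translated into another's. The asset ID, system names and values below are hypothetical.

```python
# Hypothetical name-service registry: each asset's identifiers across
# systems, keyed by a single internal asset ID.
REGISTRY = {
    "A-0042": {
        "location":  "Walden",
        "gis":       (42.44, -71.27),
        "financial": "CAP-88123",
        "dms_gui":   "WLDN",
    },
}

def translate(name, from_system, to_system):
    """Translate an asset name from one system's namespace to another's."""
    for ids in REGISTRY.values():
        if ids.get(from_system) == name:
            return ids.get(to_system)
    return None

print(translate("Walden", "location", "dms_gui"))
```

This is the “information-enable, don’t replace” idea in miniature: legacy applications keep their own names, and the translation layer absorbs the differences.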

Interoperability is essential at both the service level and the application level. Some vendors focus more at the service level, but consider, for example, making a cell phone call from the U.S. to France – your voice data may well be code division multiple access (CDMA) in the U.S., travel by microwave and fiber along its path, and emerge in France in a Global System for Mobile Communications (GSM) environment, yet your speech, the “application-level data,” is retained transparently (though technology does not yet address accents!).

Hardware

The world of computerized solutions does not speak to software alone. For instance, AMI storage consolidation addresses the concern that the volume of data coming into the utility will be increasing exponentially. As more meter data can be read in an on-demand fashion, data analytics will be employed to properly understand it all, requiring a sound hardware architecture to manage, back-up and feed the data into the analytics engines. In particular, storage is needed in the head-end systems and the meter-data management systems (MDMS).

Head-end systems pull data from the meters to provide management functionality, while the MDMS collects data from head-end systems and validates it. Then the data can be used by billing and other business applications. Data in both the head-end systems and the master copy of the MDMS is replicated into multiple copies for full backup and disaster recovery. For the MDMS, the master database that stores all the aggregated data is replicated for other business applications, such as a customer portal or data analytics, so that the master copy of the data is not tampered with.
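The validation step an MDMS performs is commonly called validation, estimation and editing (VEE). A toy Python version might flag implausible interval-to-interval jumps and estimate missing values by interpolation; the jump threshold here is an illustrative assumption, not a tariff rule.

```python
def validate_readings(readings_kwh, max_jump=50.0):
    """Toy VEE pass: estimate missing values by interpolation and
    flag implausible interval-to-interval jumps."""
    cleaned, flags = [], []
    for i, r in enumerate(readings_kwh):
        if r is None:
            prev = cleaned[-1] if cleaned else 0.0
            nxt = next((x for x in readings_kwh[i + 1:] if x is not None), prev)
            cleaned.append((prev + nxt) / 2)  # linear gap fill
            flags.append("estimated")
        elif cleaned and abs(r - cleaned[-1]) > max_jump:
            cleaned.append(r)
            flags.append("suspect")
        else:
            cleaned.append(r)
            flags.append("ok")
    return cleaned, flags

print(validate_readings([10.0, None, 12.0, 90.0]))
```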

Since the smart grid essentially performs in real time, and the electricity business is non-stop, one must think of hardware and software solutions as needing to be fail-safe with automated redundancy. The AMI data especially needs to be reliable. The key factors then become: operating system stability; hardware memory access speed and range; server and power supply reliability; file system redundancy, such as a journaled file system (JFS); and techniques such as FlashCopy to provide a point-in-time copy of a logical drive.

FlashCopy can be useful in speeding up database hot backups and restores. VolumeCopy can extend the replication functionality by providing the ability to copy the contents of one volume to another. Enhanced remote mirroring (Global Mirror, Global Copy and Metro Mirror) can provide the ability to mirror data from one storage system to another over extended distances.

Conclusion

Those are seven key ingredients for designing or evaluating a recipe for success with regard to implementing the smart grid at your utility. Addressing these dimensions will help achieve a solid foundation for a comprehensive smart grid computing system architecture.

The Role of Telecommunications Providers in the Smart Grid

Utilities are facing a host of critical issues over the next 10 years. One of the major approaches to dealing with these challenges is for utilities to become much more "intelligent" through the development of Intelligent Utility Enterprises (IUE) and Smart Grids (SG). The IUE/SG will require ubiquitous communications systems throughout utility service territories, especially as advanced metering infrastructure (AMI) becomes a reality. Wireless systems, such as the widespread cellular networks that AT&T and other public carriers already operate, will play a major role in enabling these systems.

These communications must be two-way, all the way from the utility to individual homes. The Smart Grid will be a subset of the intelligent utility, enabling utility executives to make wise decisions to deal with the pending issues. Public carriers are currently positioned to support and provide a wide range of communications technologies and services, such as WiFi, satellite and cellular, which they are continuing to develop to meet current and future utility needs.

Supply and demand reaching critical concern

Utilities face some formidable mountains in the near future, and they must climb them in the crosshairs of regulatory, legislative and public scrutiny. These include a looming, increasing shortage of electricity, which may become more critical as global warming concerns begin to compromise the ability to build large generating plants, especially those fueled by coal.

Utilities also have to contend with the growing political strength of an environmental movement that opposes most forms of generation other than those designated as "green energy." Thus, utilities face a political/legislative/regulatory perfect storm, on the one hand reducing their ability to generate electricity by conventional methods and, on the other, requiring levels of reliability they increasingly are finding it impossible to meet.

The Intelligent Utility Enterprise and Smart Grid, with AMI as a subset of the Smart Grid, as potential, partial solutions

The primary solution proposed to date, which utilities can embrace on their own without waiting for regulatory/legislative/political clarity, is to use IUE technology to become much more effective organizations and, through SGs, to substitute intelligence for manpower. The Smart Grid evolution also will enable the general public to take part in solving these problems through demand response. A subset of that evolution will be outage management, ensuring that outages are anticipated and, except where required by supply shortages, minimized rapidly and effectively.

The IUE/SG, for the first time, will enable utility executives to see exactly what is happening on the grid in real time, so they can make the critical, day-to-day decisions in an environment of increasingly high prices and diminished supply for electricity.

Wireless To Play A Major Role In Required Ubiquitous Communications

Automating the self-operating, self-healing grid – artificial intelligence

The IUE/SG obviously will require enterprise-wide digital communications to enable the rapid transfer of data between one system and another, all the way from smart meters and other in-home gateways to the boardrooms where critical decisions will be made. Already utilities have embraced service-oriented architecture (SOA), as a means of linking everything together. SOA-enabled systems are easily linked over IP, which is capable of operating over traditional wire and fiber optic communications systems, which many utilities have in place, as well as existing cellular wireless systems. Wireless communications are becoming more helpful in linking disparate systems from the home, through the distribution systems, to substations, control rooms and beyond to the enterprise. The ubiquitous utility communications of the future will integrate a wide range of systems, some of them owned by the utilities and others leased and contracted by various carriers.

The Smart Grid is a subset of the entire utility enterprise and is linked to the boardroom by various increasingly intelligent systems throughout.

Utility leadership will need vital information about the operation of the grid all the way into the home, where distributed generation, net billing, and demand response reduction of voltage or current will take place. This communications network must operate in real time and must provide information to all of what traditionally were called "back office" systems, but which now must be capable of collating information never before received or considered.

The distribution grid itself will have to become much more automated, self-healing, and self-operating through artificial intelligence. Traditional SCADA (supervisory control and data acquisition) will have to become more capable, and the data it collects will have to be pushed further up into the utility enterprise and to departments that have not previously dealt with real-time data.

The communications infrastructure

In the past, utilities typically owned much of their communications systems. Most of these systems are aged, and converting them to modern digital systems is difficult and expensive.

Utilities are likely to embrace a wide range of new and existing communications technologies as they grapple with their supply/demand disconnect problem. One of these is IP/MPLS (Internet Protocol/Multi Protocol Label Switching), which already is proven in utility communications networks as well as other industries which require mission critical communications. MPLS is used to make communications more reliable and provide the prioritization to ensure the required latency for specific traffic.
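The prioritization MPLS provides can be pictured with a toy priority-queue model in Python: latency-sensitive SCADA traffic is forwarded ahead of bulk meter reads. The traffic classes and numeric priorities are illustrative only; real MPLS achieves this through labels and traffic-engineered paths, not an application-level queue.

```python
import heapq

# Illustrative traffic classes: lower number = higher priority,
# so SCADA messages are forwarded before bulk meter reads.
PRIORITY = {"scada": 0, "outage": 1, "meter-read": 2}

def forward_order(messages):
    """Return messages in the order a priority queue would forward them.
    The enumeration index keeps arrival order within a class (FIFO)."""
    q = [(PRIORITY[cls], i, cls) for i, cls in enumerate(messages)]
    heapq.heapify(q)
    return [heapq.heappop(q)[2] for _ in range(len(q))]

print(forward_order(["meter-read", "scada", "outage", "meter-read"]))
```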

One of the advantages offered by public carriers is that their networks have almost ubiquitous coverage of utility service territories, as well as built-in switching capabilities. They also have been built to communications standards that, while still evolving, help ensure important levels of security and interoperability.

"Cellular network providers are investing billions of dollars in their networks," points out Henry L. Jones II, chief technology officer at AMI vendor SmartSynch and author of the article entitled "Want six billion dollars to invest in your AMI network?"

"AT&T alone will be spending 16-17 billion dollars in 2009," Jones notes. "Those investments are spent efficiently in a highly competitive environment to deliver high-speed connectivity anywhere that people live and work. Of course, the primary intent of these funds is to support mobile users with web browsing and e-mail. Communicating with meters is a much simpler proposition, and one can rely on these consumer applications to provide real-world evidence that scalability to system-wide AMI will not be a problem."

Utilities deal in privileged communications with their customers, and their systems are vulnerable to terrorism. As a result, Congress, through the Federal Energy Regulatory Commission (FERC), designated NERC as the agency responsible for ensuring security of all utility facilities, including communications.

As an example of meeting security needs at a major utility, AT&T is providing redundant communications systems over a wireless WAN for a utility’s 950 substations, according to Andrew Hebert, AT&T Area Vice President, Industry Solutions Mobility Practice. This enables the utility to meet critical infrastructure protection standards and "harden" its SCADA and distribution automation systems by providing redundant communications pathways.

SCADA communication, distributed automation, and even devices providing artificial intelligence reporting are possible with today’s modern communications systems. Latency is important in terms of automatic fault reporting and switching. The communications network must provide the delivery-time performance to support substation automation as identified in IEEE 1646. Some wireless systems now offer latencies in the 125ms range. Some of the newer systems are designed for no more than 50ms latency.

As AMI becomes more widespread, the utility-side control of millions of in-home and in-business devices will have to be managed. Meter readings must be collected and routed to meter data management systems. While it is possible to feed all this data directly to some central location, it is likely that this data avalanche will be routed through substations for aggregation, handling and transfer to corporate WANs. As the number of meter points grows – and the number of readings taken per hour and the number of in-home control signals increase – bandwidth and latency factors will have to be considered carefully.
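A back-of-envelope Python calculation shows how read frequency drives data volume. The 100-byte per-reading figure is an assumed round number for illustration, not a standard message size.

```python
def daily_ami_volume_gb(meters, reads_per_hour, bytes_per_read=100):
    """Back-of-envelope daily AMI data volume in GB.
    The 100-byte reading size is an illustrative assumption."""
    return meters * reads_per_hour * 24 * bytes_per_read / 1e9

# One million meters: hourly reads vs 15-minute reads.
print(daily_ami_volume_gb(1_000_000, 1))  # hourly
print(daily_ami_volume_gb(1_000_000, 4))  # 15-minute intervals
```

Moving from hourly to 15-minute reads quadruples the volume, which is exactly why aggregation at substations and careful bandwidth planning matter.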

Public cellular carriers already have interoperability (e.g., you can call someone on a cell phone although they use a different carrier), and it is likely that there will be more standardization of communications systems going forward. A paradigm shift toward national and international communications interoperability already has occurred – for example, with the global GSM standard on which the AT&T network is based. A similar shift in the communications systems utilities use is necessary and likely to come about in the next few years. It no longer is practical for utilities to have to cobble together communications with varying standards for different portions of their service territory, or different functional purposes.

Measuring Smart Metering’s Progress

Smart or advanced electricity metering, using a fixed network communications path, has been with us since pioneering installations in the US Midwest in the mid-1980s. That’s 25 years ago, during which time we have seen incredible advancements in information and communication technologies.

Remember the technologies of 1985? The very first mobile phones were just being introduced. They weighed as much as a watermelon and cost nearly $9,000 in today’s dollars. SAP had just opened its first sales office outside of Germany, and Oracle had fewer than 450 employees. The typical personal computer had a 10 megabyte hard drive, and a dot-com Internet domain was just a concept.

We know how much these technologies have changed since then, how they have been embraced by the public, and (to some degree at least) where they are going in the future. This article looks at how smart metering technology has developed over the same period. What has been the catalyst for advancements? And, most important, what does that past tell us about the future of smart metering?

Peter Drucker once said that “trying to predict the future is like trying to drive down a country road at night with no lights while looking out the back window.”

Let’s take a brief look out the back window, before driving forward.

Past Developments

Developments in the parallel field of wireless communications, with its strong standards base, are readily delineated into clear technology generations. While we cannot as easily pinpoint definitive phases of smart metering technology, we can see some major transitions and discern patterns from the large deployments illustrated in Figure 1, and perhaps, even identify three broad smart metering “generations.”

The first generation is probably the clearest to delineate. The first 10 years of smart metering deployments (until about 2004) were all one-way wireless, limited two-way wireless, or very low-bandwidth power-line carrier communications (PLC) to the meter, concentrated in the U.S. The market at this time was dominated by Distribution Control Systems, Inc. (DCSI) and what was then CellNet Data Systems, Inc. Itron’s Fixed Network 2.0 and Hunt Technologies’ TS1 solution would also fit into this generation.

More than technology, the strongest characteristic of this first generation is the limited scope of business benefits considered. With the exception of Puget Sound Energy’s time-of-use pricing program, the business case for these early deployments was focused almost exclusively on reducing meter reading costs. Effectively, these early deployments reproduced the same business case as mobile automated meter reading (AMR).

By 2004, approximately 10 million of these smart meters had been installed in the U.S. (about 7 percent of the national total); however, whatever public perception of smart metering there was at the time was decidedly mixed. The deployments received scant media coverage, which focused almost solely on troubled time-of-use pricing programs, perhaps digressing briefly to cover smart metering vendor mergers and lawsuits. But generally smart meters, by any name, were unknown among the general population.

Today’s Second Generation

By the early 2000s, some utilities, notably PPL and PECO, both in Pennsylvania, were beginning to expand the use of their smart metering infrastructure beyond the simple meter-to-cash process. With incremental enhancements to application integration that were based on first generation technology, they were initiating projects to use smart metering to: transform outage identification and response; explore more frequent reading and more granular data; and improve theft detection.

These initiatives were the first to give shape to a new perspective on smart metering, but it was power company Enel’s dramatic deployment of 30 million smart meters across Italy that crystallized the second generation.

For four years leading to 2005, Enel fully deployed key technology advancements, such as universal and integrated remote disconnect and load limiting, that previously did not exist on any real scale. These changes enabled a dramatically broader scope of business benefits as this was the first fully deployed solution designed from the ground up to look well beyond reducing meter reading costs.

The impact of Enel’s deployment and subsequent marketing campaign on smart metering developments in other countries should not be underestimated, particularly among politicians and regulators outside the U.S. In European countries, particularly Italy, and regions such as Scandinavia, the same model (and in many cases the same technology) was deployed. Enel demonstrated to the rest of the world what could be done without any high-profile public backlash. It set a competitive benchmark that had policymakers in other countries questioning progress in their jurisdictions and challenging their own utilities to achieve the same.

North American Resurgence

As significant as Enel’s deployment was on the global development of smart metering, it is not the basis for today’s ongoing smart metering technology deployments now concentrated in North America.

More than the challenges of translating a European technology to North America, the business objectives and customer environments were different. As the Enel deployment came to an end, governments and regulators – particularly those in California and Ontario – were looking for smart metering technology to be the foundation for major energy conservation and peak-shifting programs. They expected the technology to support a broad range of pricing programs, provide on-demand reads within minutes, and gather hourly interval profile data from every meter.

Utilities responded. Pacific Gas & Electric (PG&E), with a total of 9 million electric and natural gas meters, kick-started the movement. Others, notably Southern California Edison (SCE), invested the time and effort to advance the technology, championing additions such as remote firmware upgrades and home area network support.

As a result, a near dormant North American smart metering market was revived in 2007. The standard functionality we see in most smart metering specifications today and the technology basis for most planned deployments in North America was established.

These technology changes also contributed to a shift in public awareness of smart meters. As smart metering was considered by more local utilities, and more widely associated with growing interest in energy conservation, media interest grew exponentially. Between 2004 and 2008, references to smart or advanced meters (carefully excluding smart parking meters) in the world’s major newspapers nearly doubled every year, to the point where the technology is now almost common knowledge in many countries.

The Coming Third Generation

In the 25 years since smart meters were first substantially deployed, the technology has progressed considerably. While progress has not been as rapid as advancements in consumer communications technologies, smart metering developments such as universal interval data collection, integrated remote disconnect and load limiting, remote firmware upgrades and links to a home network are substantial advancements.

All of these advancements have been driven by the combination of forward-thinking government policymakers, a supportive regulator and, perhaps most important, a large utility willing to invest the time and effort to understand and demand more from the vendor community.

With this understanding of the drivers, and based on the technology deployment plans, we can map out key future smart metering technology directions. We expect to see the next generation of smart metering exhibit two dominant differences from today’s technology. This includes increased standardization across the entire smart metering solution scope and changes to back-office systems architecture that enables the extended benefits of smart metering.

Increased Standardization

The transition to the next generation of smart metering will be known more for its changes to how a smart meter works, rather than what a smart meter does.

The direct functions of a smart meter appear to be largely set. We expect to see continued incremental advancements in data quality and read reliability; improved power quality measurement; and more universal deployment of a remote disconnect and load limiting.

But how a smart meter provides these functions will further change. We believe the smart meter will become a much more integrated part of two networks: one inside the home; the other along the electricity distribution network.

Generally, an expectation of standards for communication from the meter into a home area network is well accepted by the industry – although the actual standard to be applied is still in question. As this home area network develops, we expect a smart meter to increasingly become a member of this network, rather than the principal mechanism in creating one.

As other smart grid devices are deployed further down the low voltage distribution system, we expect utilities to demand that the meter conform to these network communications standards. In other words, utilities will continue to reject the idea that other types of smart grid devices – those with even greater control of the electrical network – be incorporated into a proprietary smart meter local area network.

It appears that most of this drive to standardization will not be led by utilities in North America. For one, technology decisions in North America are rapidly being completed (for this first round of replacements, at least). The recent Federal Energy Regulatory Commission (FERC) staff report, entitled “2008 Assessment of Demand Response and Advanced Metering,” found that of the 145 million meters in the U.S., utilities have already contracted to replace nearly 52 million with smart meters over the next five to seven years.

IBM’s analysis indicated that larger utilities have declared plans to replace these meters even faster – approximately 33 million smart meters by 2013. The meter communications approach, and quite often the vendors chosen for these deployments, has typically already been selected, leaving little room to fundamentally change the underlying technological approach.

Outside of Worldwide Interoperability for Microwave Access (WiMAX) experiments by utilities such as American Electric Power (AEP) and those in Ontario, and shared services initiatives in Texas and Ontario, none of the remaining large North American utilities appear to have a compelling need to drive dramatic technology advancements, given rate and time pressures from regulators.

Conversely, a few very large European programs are poised to push the technology toward much greater standards adoption:

  • EDF in France has started a trial of 300,000 meters using standard power line carrier (PLC) communications from the meter to the concentrator. A full deployment to all 35 million EDF meters is expected to follow.
  • The U.K. government recently announced a mandatory replacement of both electricity and natural gas meters for all 46 million customers between 2010 and 2020. The U.K.’s unique market structure, in which competitive retailers are responsible for meter ownership and operation, is driving interoperability standards beyond currently available technology.
  • With its PRIME initiative, the Spanish utility Iberdrola plans to develop a new PLC-based, open standard for smart metering. It is starting with a pilot project in 2009, leading to full deployment to more than 10 million residential customers.

The combination of these three smart metering projects alone will affect 91 million smart meters, roughly two-thirds the size of the total U.S. market. This European focus is expected to grow now that the Iberdrola project has taken the first steps to be the basis for the European Commission’s Open Meter initiative, involving 19 partners from seven European countries.
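The deployment totals above can be reproduced with simple arithmetic, using only the figures quoted in this article:

```python
# Rough check of the deployment figures cited above.
# All counts are the article's own numbers, in millions of meters.
edf_france = 35   # planned full EDF deployment
uk_mandate = 46   # U.K. electricity and gas meter replacement
iberdrola = 10    # Iberdrola PRIME rollout ("more than 10 million")

european_total = edf_france + uk_mandate + iberdrola
us_market = 145   # total U.S. meters per the FERC staff report

print(european_total)                        # 91 (million meters)
print(round(european_total / us_market, 2))  # 0.63, i.e. roughly two-thirds
```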

Rethinking Utility System Architectures

Perhaps the greatest changes to future smart metering systems will have nothing to do with the meter itself.

To date, standard utility applications for customer care and billing, outage management, and work management have been largely unchanged by smart metering. In fact, to reduce risk and meet schedules, utilities have understandably shielded legacy systems from the changes needed to support a smart meter rollout or new tariffs. They have looked to specialized smart metering systems, particularly meter data management systems (MDMS), to bridge the gap between a new smart metering infrastructure and their legacy systems.

As a result, many of the potential benefits of a smart metering infrastructure have yet to be fully realized. For instance, billing systems still operate on cycles set by past meter reading routes. Most installed outage management applications are unable to take advantage of a direct near-real-time connection to nearly every end point.

As application vendors catch up, we expect the third generation of smart meters to be characterized by changes to the overall utility architectures and the applications that comprise them. As applications are enhanced, and enterprise architectures adapted to the smart grid, we expect to see significant architectural changes, such as:

  • Much of the message-brokering function that an MDMS performs between disparate head-end systems and utility applications will migrate to the utility’s service bus.
  • As smart meters increasingly become devices on a standards-based network, more general network management applications now widely deployed for telecommunications networks will supplement vendor head-end systems.
  • Complex estimating and editing functions will become less valuable as the technology in the field becomes more reliable.
  • Security of the system, from home network to the utility firewall, needs to meet the much higher standards associated with grid operations, rather than those arising from the current meter-as-the-cash-register perspective.
  • Add-on functionality provided by some niche vendors will migrate to larger utility systems as they evolve to a smart metering world. For instance, Web presentment of interval data to customers will move from dedicated sites to become a broad part of utilities’ online offerings.
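The first architectural shift listed above, message brokering moving off a dedicated MDMS onto the utility’s enterprise service bus, can be illustrated with a minimal publish/subscribe sketch. Every system name, topic, and field below is a hypothetical illustration, not drawn from any real product:

```python
# Toy sketch: meter-data brokering on a generic publish/subscribe
# service bus, rather than inside a dedicated MDMS broker.
# All names and topics here are hypothetical illustrations.
from collections import defaultdict
from typing import Callable

class ServiceBus:
    """A minimal stand-in for the utility's enterprise service bus."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Utility applications register interest in a message topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # A head-end system pushes an event to every interested application.
        for handler in self._subscribers[topic]:
            handler(message)

bus = ServiceBus()
received = []

# Billing and outage management subscribe directly to meter events,
# instead of polling a specialized MDMS for them.
bus.subscribe("meter.reading", lambda msg: received.append(("billing", msg)))
bus.subscribe("meter.outage", lambda msg: received.append(("oms", msg)))

# A vendor head-end system publishes onto the shared bus.
bus.publish("meter.reading", {"meter_id": "M-001", "kwh": 12.4})
bus.publish("meter.outage", {"meter_id": "M-001", "status": "last_gasp"})

print(received)
```

The design point is that applications couple to topics on a shared bus rather than to any one vendor’s head-end interface, which is what allows head-end systems to be swapped without touching billing or outage management.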

Conclusions

Looking back at 25 years of smart metering technology development, we can see that while it has progressed, it has not developed at the pace of the consumer communications and computing technologies it relies upon – and for good reasons.

Utilities operate under a very different investment timeframe compared to consumer electronics; decisions made by utilities today need to stand for decades, rather than mere months. While consumer expectations of technology and service continue to grow with each generation, in the regulated electricity distribution industry, any customer demands are often filtered through a blurry political and regulatory lens.

Even with these constraints, smart metering technology has evolved rapidly, and will continue to change in the future. The next generation, with increased standardized integration with other networks and devices, as well as changes to back office systems, will certainly transform what we now call smart metering. So much so, that much sooner than 25 years from now, those looking back at today’s smart meters may very well see them as we now see those watermelon-sized cell phones of the 1980’s.

Policy and Regulatory Initiatives And the Smart Grid

Public policy is commonly defined as a plan of action designed to guide decisions for achieving a targeted outcome. In the case of smart grids, new policies are needed if smart grids are actually to become a reality. This statement may sound dire, given the recent signing into law of the 2007 Energy Independence and Security Act (EISA) in the United States. And in fact, work is underway in several countries to encourage smart grids and smart grid components such as smart metering. However, the risk still exists that unless stronger policies are enacted, grid modernization investments will fail to leverage the newer and better technologies now emerging, and smart grid efforts will never move beyond demonstration projects. This would be an unfortunate result when you consider the many benefits of a true smart grid: cost savings for the utility, reduced bills for customers, improved reliability and better environmental stewardship.

REGIONAL AND NATIONAL EFFORTS

As mentioned above, several regions are experimenting with smart grid provisions. At the national level, the U.S. federal government has enacted two pieces of legislation that support advanced metering and smart grids. The Energy Policy Act of 2005 directed U.S. utility regulators to consider time-of-use meters for their states. The 2007 EISA legislation has several provisions, including a list of smart grid goals to encourage two-way, real-time digital networks that stretch from a consumer’s home to the distribution network. The law also provides monies for regional demonstration projects and matching grants for smart grid investments. The EISA legislation also mandates the development of an “interoperability framework.”

In Europe, the European Union (E.U.) introduced a strategic energy technology plan in 2006 for the development of a smart electricity system over the next 30 years. The European Technology Platform organization includes representatives from industry, transmission and distribution system operators, research bodies and regulators. The organization has identified objectives and proposes a strategy to make the smart grid vision a reality.

Regionally, several U.S. states and Canadian provinces are focused on smart grid investments. In Canada, the Ontario Energy Board has mandated smart meters, with meter installation completion anticipated by 2010. In Texas, the Public Utilities Commission of Texas (PUCT) has finalized advanced metering legislation that authorizes metering cost recovery through surcharges. The PUCT also stipulated key components of an advanced metering system: two-way communications, time-date stamp, remote connect/disconnect, and access to consumer usage for both the consumer and the retail energy provider. The Massachusetts State Senate approved an energy bill that includes smart grid and time-of-use pricing. The bill requires that utilities submit a plan by Sept. 1, 2008, to the Massachusetts Public Utilities Commission, establishing a six-month pilot program for a smart grid. Most recently, California, Washington state and Maryland all introduced smart grid legislation.

AN ENCOMPASSING VISION

While these national and regional examples represent just a portion of the ongoing activity in this area, the issue remains that smart grid and advanced metering pilot programs do not guarantee a truly integrated, interoperable, scalable smart grid. Granted, a smart grid is not achieved overnight, but an encompassing smart grid vision should be in place as modernization and metering decisions are made, so that each investment remains consistent with the overall plan. Obviously, challenges – such as financing, system integration and customer education – exist in moving from pilot to full grid deployment. However, many utility and regulatory personnel perceive these challenges to be ones of costs and technology readiness.

The costs are considerable. KEMA, the global energy consulting firm, estimates that the average cost of a smart meter project (representing just a portion of a smart grid project) is $775 million. The E.U.’s Strategic Energy Technology Plan estimates that the total smart grid investment required could be as much as $750 billion. These amounts are staggering when you consider that the market capitalization of all U.S. investor-owned electric utilities is roughly $550 billion. However, they’re not nearly as significant when you subtract the costs of fixing the grid using business-as-usual methods. Transmission and distribution expenditures are occurring with and without intelligence. The Energy Information Administration (EIA) estimates that between now and 2020 more than $200 billion will be spent to maintain and expand electricity transmission and distribution infrastructures in the United States alone.
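The subtraction argument above can be made concrete, with the caveat that the article’s figures mix scopes (the $750 billion estimate is for the E.U., the $200 billion for the U.S.), so this is illustrative only:

```python
# Back-of-envelope version of the argument above, using only the
# figures cited in this article (amounts in billions of U.S. dollars).
# Note: the two estimates cover different regions, so this is
# illustrative of the reasoning, not a rigorous comparison.
smart_grid_total = 750    # E.U. Strategic Energy Technology Plan estimate
business_as_usual = 200   # EIA estimate of U.S. T&D spend through 2020

# Spending that happens "with or without intelligence" offsets part
# of the headline smart grid figure.
incremental = smart_grid_total - business_as_usual
print(incremental)  # 550
```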

Technology readiness will always be a concern in large system projects. Advances are being made in communication, sensor and security technologies, and IT. The Federal Communications Commission is pushing for auctions to accelerate adoption of different communication protocols. Price points are decreasing for pervasive cellular communication networks. Electric power equipment manufacturers are utilizing the new IEC 61850 standard to ensure interoperability among sensor devices. Vendors are building security products that will enable North American Electric Reliability Corp. (NERC) critical infrastructure protection (CIP) compliance.

In addition, IT providers are using event-driven architecture to ensure responsiveness to external events, rather than processing transactional events, and reaching new levels with high-speed computer analytics. Leading service-oriented architecture companies are working with utilities to establish the underlying infrastructure critical to system integration. Finally, work is occurring in the standards community by the E.U., the GridWise Architecture Council (GAC), IntelliGrid, the National Energy Technology Laboratory (NETL) and others to create frameworks for linking communication and electricity interoperability among devices, systems and data flows.

THE TIME IS NOW

These challenges should not halt progress – especially when one considers the societal benefits. Time stops for no one, and that is certainly true of the energy sector. Energy demand is increasing: the Energy Information Administration estimates that annual energy demand will increase roughly 50 percent over the next 25 years. Meanwhile, the debate over global warming has waned; few authorities now dispute the escalating concentrations of several greenhouse gases due to the burning of fossil fuels. The E.U. is attempting to decrease emissions through its 2006 Energy Efficiency Directive. Many industry observers in the United States believe that there will likely be federal regulation of greenhouse gases within the next three years.

A smart grid would address many of these issues, giving consumers options to manage their usage and costs. By optimizing asset utilization, the smart grid will provide savings by reducing the need to build more power plants to meet increased electricity demand. As a self-healing grid that detects, responds and restores functions, the smart grid can greatly reduce the economic impact of blackouts and other power interruptions.

A smart grid that provides the needed power quality can ensure the strong and resilient energy infrastructure necessary for the 21st-century economy. Lastly, a smart grid will enable plug-and-play integration of renewables, distributed resources and control systems.

INCENTIVES FOR MODERNIZATION

Despite all of these potential benefits, more incentives are needed to drive grid modernization efforts. Several mechanisms are available to encourage investment. Some utilities are already using or evaluating alternative rate structures such as net metering and revenue decoupling to give utilities and consumers incentives to use less energy. Net metering awards energy credits for consumer-based renewable generation. And revenue decoupling is a mechanism designed to eliminate or reduce the dependence of a utility’s revenues on sales. Other programs – such as energy-efficiency or demand-reduction incentives – motivate consumers and businesses to adopt long-term energy-efficient behaviors (such as using programmable thermostats) and to consider energy efficiency when using appliances and computers, and even operating their homes.
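The net metering mechanism described above can be sketched as a simple bill calculation. This is an illustrative model only, not any specific tariff; the function name, the single retail rate, and the kWh carry-forward rule are all assumptions:

```python
# Illustrative net metering bill: consumption is offset by on-site
# renewable generation, and any surplus is banked as a kWh credit.
# A sketch under simplifying assumptions, not a real utility tariff.
def net_metering_bill(consumed_kwh: float, generated_kwh: float,
                      retail_rate: float) -> tuple[float, float]:
    """Return (amount due in dollars, surplus kWh carried forward)."""
    net = consumed_kwh - generated_kwh
    if net >= 0:
        # Customer used more than they generated: pay for the net draw.
        return round(net * retail_rate, 2), 0.0
    # Customer generated a surplus: nothing due, bank the excess kWh.
    return 0.0, -net

bill, credit = net_metering_bill(consumed_kwh=600, generated_kwh=450,
                                 retail_rate=0.12)
print(bill, credit)  # 18.0 0.0
```

A household generating more than it consumes in a period would owe nothing and carry the surplus forward, which is the consumer-side incentive the article describes.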

Policy and regulatory strategy should incorporate these means and include others, such as accelerated depreciation and tax incentives. Accelerated depreciation encourages businesses to purchase new assets, since depreciation is steeper in the earlier years of the asset’s life and taxes are deferred to a later period. Tax incentives could be put in place for purchasing smart grid components. Utility commissions could require utilities to consider all societal benefits, rather than just rate impacts, when crafting the business case. Utilities could take federal income tax credits for the investments. Leaders could include smart grid technologies as a critical component of their overall energy policy.

Only when all of these policies and incentives are put in place will smart grids truly become a reality.

Is Your Mobile Workforce Truly Optimized?

ClickSoftware is the leading provider of mobile workforce management and service optimization solutions that create business value for service operations through higher levels of productivity, customer satisfaction and cost effectiveness. Combining educational, implementation and support services with best practices and its industry-leading solutions, ClickSoftware drives service decision making across all levels of the organization.

Our mobile workforce management solution helps utilities empower mobile workers with accurate, real-time information for optimum service and quick on-site decision making. From proactive customer demand forecasting and capacity planning to real-time decision-making, incorporating scheduling, mobility and location-based services, ClickSoftware helps service organizations get the most out of their resources.

The IBM-ClickSoftware alliance provides the most comprehensive offering for Mobile Workforce and Asset Management powering the real-time service enterprise. Customers can benefit from maximized workforce productivity and customer satisfaction while controlling, and then minimizing, operational costs.

ClickSoftware provides a flexible, scalable and proven solution that has been deployed at many utility companies around the world. Highlights include the ability to:

  • Automatically update the schedule based on real-time information from the field;
  • Manage crews (parts and people);
  • Cover a wide variety of job types within one product – from short jobs requiring one person to multistage jobs needing a multi-person team over several days or weeks;
  • Balance regulatory, environmental and union compliance;
  • Continuously strive to raise the bar in operational excellence;
  • Incorporate street-level routing into the decision-making process; and
  • Plan for the catastrophic events and seasonal variability in field service operations.

The resulting value proposition to the customer is extremely compelling:

  • Typically, optimized scheduling and routing of the mobile workforce generates a 31 percent increase in jobs per day versus the industry average (Source: AFSMI survey 2003).
  • A variety of solutions, ranging from entry level to advanced, directly address the broad spectrum of pains experienced by service organizations around the world, including optimized scheduling, routing, mobile communications and integration of solutions components – within the service optimization solution itself and also into the CRM/ERP/EAM back end.
  • An entry level offering with a staged upgrade path toward a fully automated service optimization solution ensures that risk is managed and the most challenging of customer requirements may be met. This "least risk" approach for the customer is delivered by a comprehensive set of IBM business consulting, installation and support services.
  • The industry-proven credibility of ClickSoftware’s ServiceOptimization Suite, combined with IBM’s wireless middleware, software, hardware and business consulting services, provides the customer with the most effective platform for managing field service operations.

ClickSoftware’s customers represent a cross section of leaders in the utilities, telecommunications, computer and office equipment, home services, and capital equipment industries. Close to 100 customers around the world have employed ClickSoftware service optimization solutions and services to achieve optimal levels of field service.

To find out more visit www.clicksoftware.com or call 888.438.3308.