The Power of Prediction: Improving the Odds of a Nuclear Renaissance

After 30 years of disfavor in the United States, the nuclear power industry is poised for resurgence. With the passage of the Energy Policy Act of 2005, the specter of over $100 per barrel oil prices and the public recognition that global warming is real, nuclear power is now considered one of the most practical ways to clean up the power grid and help the United States reduce its dependence on foreign oil. The industry has responded with a resolve to build a new fleet of nuclear plants in anticipation of what has been referred to as a nuclear renaissance.

The nuclear power industry is characterized by a remarkable level of physics and mechanical science. Yet, given the confluence of a number of problematic issues – an aging workforce, the shortage of skilled trades, the limited availability of equipment and parts, and a history of late, over-budget projects – questions arise about whether the level of management science the industry plans to use is sufficient to navigate the challenges ahead.

According to data from the Energy Information Administration (EIA), nuclear power accounts for approximately 20 percent of U.S. capacity, producing approximately 106 gigawatts (GW) from 104 reactor units housed at 66 plants. To date, more than 30 new reactors have been proposed, which would yield a net increase of approximately 19 GW of nuclear capacity through 2030. Considering the growth of energy demand, this increased capacity will barely keep pace with increasing base load requirements.

According to Assistant Secretary for Nuclear Energy Dennis Spurgeon, the United States will need approximately 45 new reactors online by 2030 just to maintain the 20 percent share of U.S. electricity generation that nuclear power already holds.

Meanwhile, Morgan Stanley vice chairman Jeffrey Holzschuh is very positive about the next generation of nuclear power but warns that the industry’s future is ultimately a question of economics. “Given the history, the markets will be cautious,” he says.

As shown in Figures 1-3, nuclear power is cost competitive with other forms of generation, but its upfront capital costs are comparatively high. Historically, long construction periods have led to serious cost volatility. The viability of the nuclear power industry ultimately depends on its ability to demonstrate that plants can be built economically and reliably. Holzschuh predicts, “The first few projects will be under a lot of public scrutiny, but if they are approved, they will get funded. The next generation of nuclear power will likely be three to five plants or 30, nothing in between.”

Due to its cohesive identity, the nuclear industry is viewed by the public and investors as a single entity, making the fate of industry operators – for better or for worse – a shared destiny. For that reason, it’s widely believed that if these first projects suffer the same sorts of significant cost overruns and delays experienced in the past, the projected renaissance will quickly give way to a return to the dark ages.

THE PLAYERS

Utility companies, regulatory authorities, reactor manufacturers, design and construction vendors, financiers and advocacy groups all have critical roles to play in creating a viable future for the nuclear power industry – one that will begin with the successful completion of the first few plants in the United States. By all accounts, an impressive foundation has been laid, beginning with an array of government incentives (such as loan guarantees and tax credits) and simplified regulation to help jump-start the industry.

Under the Energy Policy Act of 2005, the U.S. Department of Energy has the authority to issue $18.5 billion in loan guarantees for new nuclear plants and $2 billion for uranium enrichment projects. In addition, there’s standby support for indemnification against Nuclear Regulatory Commission (NRC) and litigation-oriented delays for the first six advanced nuclear reactors. The Treasury Department has issued guidelines for an allocation and approval process for production tax credits for advanced nuclear: a 1.8-cent-per-kilowatt-hour credit for the first eight years of operation, with the final rules to be issued in fiscal year 2008.

The 20-year renewal of the Price-Anderson Act in 2005 and anticipated future restrictions on carbon emissions further improve the comparative attractiveness of nuclear power. To be eligible for the 2005 production tax credits, a license application must be tendered to the NRC by the end of 2008, with construction beginning before 2014 and the plant placed in service before 2021.

The NRC has formed an Office of New Reactors (NRO), and David Matthews, director of the Division of New Reactor Licensing, led the development of the latest revision of a new licensing process that’s designed to be more predictable by encouraging the standardization of plant designs, resolving safety and environmental issues and providing for public participation before construction begins. With a fully staffed workforce and a commitment to “enable the safe, secure and environmentally responsible use of nuclear power in meeting the nation’s future energy needs,” Matthews is determined to ensure that the NRC is not a risk factor that contributes to the uncertainty of projects but rather an organizing force that will create predictability. Matthews declares, “This isn’t your father’s NRC.”

This simplified licensing process consists of the following elements:

  • An early site permit (ESP) for locations of potential facilities.
  • Design certification (DC) for the reactor design to be used.
  • Combined operating license (COL) for the certified reactor as designed to be located on the site. The COL contains the inspections, tests, analyses and acceptance criteria (ITAAC) to demonstrate that the plant was built to the approved specifications.

According to Matthews, the best-case scenario for the period between when a COL is docketed and when the license process is complete is 33 months, with an additional 12 months for public hearings. When asked if anything could be done to speed this process, Matthews reported that every delay he’s seen thus far has been attributable to a cause beyond the NRC’s control. Most often, it’s the applicant that’s having a hard time meeting the schedule. Recently approved schedules are several months longer than the best-case estimate.

The manufacturers of nuclear reactors have stepped up to the plate to achieve standard design certification for their nuclear reactors; four are approved, and three are in progress.

Utility companies are taking innovative approaches to support the NRC’s standardization principles, which directly impact costs. (Current conventional wisdom puts the price of a new reactor at between $4 billion and $5.5 billion, with some estimates of fully loaded costs as high as $7 billion.) Consortiums have been formed to support cross-company standardization around a particular reactor design. NuStart and UniStar are multi-company consortiums collaborating on the development of their COLs.

Bryce Shriver, who leads PPL Corp.’s nuclear power strategy – and who recently announced PPL had selected UniStar to build its next nuclear facility – is impressed with the level of standardization UniStar is employing for its plants. From the specifics of the reactor design to the carpet color, UniStar – with four plants on the drawing board – intends to make each plant as identical as possible.

Reactor designers and construction companies are adding to the standardization with turnkey approaches, formulating new construction methods that include modular techniques; sophisticated scheduling and configuration management software; automated data, project management and document control; and designs that are substantially complete before construction begins. Contractors are taking seriously the lessons learned from plants built outside the United States, and they hope to leverage what they have learned in the first few U.S. projects.

The stewards of the existing nuclear fleet also see themselves as part of the future energy solution. They know that continued safe, high-performance operation of current plants is key to maintaining public and state regulator confidence. Most of the scheduled plants are to be co-located with existing nuclear facilities.

Financing nuclear plant construction involves equity investors, utility boards of directors, debt financiers and (ultimately) the ratepayers represented by state regulatory commissions. Despite the size of these deals, the financial community has indicated that debt financing for new nuclear construction will be available. The bigger issue lies with the investors. The more equity-oriented the risk (principally borne by utilities and ratepayers), the more caution there is about the structure of these deals. The debt financiers are relying on the utilities and the consortiums to do the necessary due diligence and put up the equity. There’s no doubt that the federal loan guarantees and subsidies are an absolute necessity, but this form of support is largely driven by the perceived risk of the first projects. Once the capability to build plants in a predictable way (in terms of time, cost, output and so on) has been demonstrated, market forces are expected to be very efficient at allocating capital to these kinds of projects.

The final key to the realization of a nuclear renaissance is the public. Americans have become increasingly concerned about fossil fuels, carbon emissions and the nation’s dependence on foreign oil. The surge in oil prices has focused attention on energy costs and national security. Coal-based energy production is seen as an environmental issue. Although the United States has plenty of access to coal, dealing with carbon emissions using clean coal technology involves sequestering it and pumping it underground. PPL chairman Jim Miller describes the next challenge for clean coal as NUMBY – the “Not under my back yard” attitude the public is likely to adopt if forced to consider carbon pumped under their communities. Alternative energy sources such as wind, solar and geothermal enjoy public support, but they are not yet scalable for the challenge of cleaning up the grid. In general, the public wants clean, safe, reliable, inexpensive power.

THE RISKS

Will nuclear fill that bill and look attractive compared with the alternatives? Although progress has been made and the stage is set, critical issues remain, and they could become problematic. While the industry clearly sees and is actively managing some of these issues, there are others the industry sees but is not as certain about how to manage – and still others that are so much a part of the fabric of the industry that they go unrecognized. Any one of these issues could slow progress; the fact that there are several that could hit simultaneously multiplies the risk exponentially.

The three widely accepted risk factors for the next phase of nuclear power development are the variability of the cost of uranium, the availability of quality equipment for construction and the availability of well-trained labor. Not surprising for an industry that’s been relatively sleepy for several decades, the pipeline for production resources is weak – a problem compounded by the well-understood coming wave of retirements in the utility workforce and the general shortage of skilled trades needed to work on infrastructure projects. Combine these constraints with a surge in worldwide demand for power plants, and it’s easy to understand why the industry is actively pursuing strategies to secure materials and train labor.

The reactor designers, manufacturers and construction companies that would execute these projects display great confidence. They’re keen on the “turnkey solution” as a way to reduce the risk of multiple vendors pointing fingers when things go wrong. Yet these are the same firms that have been openly criticized for change orders and cost overruns. Christopher Crane, chief operating officer of the utility Exelon Corp., warned contractors in a recent industry meeting that the utilities would “not take all the risk this time around.” When faced with complicated infrastructure development in the past, vendors have often pointed to their expertise with complex projects. Is the development of more sophisticated scheduling and configuration management capability, along with the assignment of vendor accountability, enough to handle the complexity issue? The industry is aware of this limitation but does not as yet have strong management techniques for handling it effectively.

Early indications from regulators are that the COLs submitted to date are not meeting the NRC’s guidance and expectations in all regards, possibly a result of the applicants’ rush to make the 2008 year-end deadline for the incentives set forth in the Energy Policy Act. This could extend the licensing process and strain the resources of the NRC. In addition, the requirements of the NRC principally deal with public safety and environmental concerns. There are myriad other design requirements entailed in making a plant operate profitably.

The bigger risk is that the core strength of the industry – its ability to make significant incremental improvements – could also serve as the seed of its failure as it faces this next challenge. Investors, state regulators and the public are not likely to excuse serious cost overruns and time delays as they may have in the past. Utility executives are clear that nuclear is good to the extent that it’s economical. When asked what single concern they find most troubling, they often reply, “That we don’t know what we don’t know.”

What we do know is that there are no methods currently in place for beginning successful development of this next generation of nuclear power plants, and that the industry’s core management skill set may not be sufficient to build a process that differs from a “learn as you go” approach. Thus, it’s critical that the first few plants succeed – not just for their investors but for the entire industry.

THE OPPORTUNITY – KNOWING WHAT YOU DON’T KNOW

The vendors supporting the nuclear power industry represent some of the most prestigious engineering, equipment design and manufacturing firms in the world: Bechtel, Fluor, GE, Westinghouse, Areva and Hitachi. Despite this, the industry is not known for having a strong foundation in managing innovation. Getting a plant to the point of producing power requires not only complex physical capital and technology but also myriad intangible assets – human expertise, political forces and public opinion. Thus, more advanced management science could represent the missing piece of the puzzle for the nuclear power industry.

An advanced decision-making framework can help utilities manage unpredictable events, increasing their ability to handle the planning challenges and unanticipated disruptions that often beset long, complex projects. By using advanced management science, the nuclear industry can take what it knows and create a learning environment to find out more about what it doesn’t know, improving its odds for success.

Microsoft Helps Utilities Use IT to Create Winning Relationships

The utilities industry worldwide is experiencing growing energy demand in a world with shifting fuel availability, increasing costs, a shrinking workforce and mounting global environmental pressures. Rate case filings and government regulations, especially those regarding environmental health and safety, require utilities to streamline reporting and operate safely enterprise-wide. At the same time, increasing competition and costs drive the need for service reliability and better customer service. Each issue causes utilities to depend more and more on information technology (IT).

The Microsoft Utility team works with industry partners to create and deploy industry-specific solutions that help utilities transform challenges into opportunities and empower utilities workers to thrive in today’s market-driven environment. Solutions are based on the world’s most cost-effective, functionally rich, and secure IT platform. The Microsoft platform is interoperable with a wide variety of systems and proven to improve people’s abilities to access information and work with others across boundaries. Together, they help utilities optimize operations in each line of business.

Customer care. Whether a utility needs to modernize a call center, add customer self-service or respond to new business requirements such as green power, Microsoft and its partners provide solutions for turning the customer experience into a powerful competitive advantage with increased cost efficiencies, enhanced customer service and improved financial performance.

Transmission and distribution. Growing energy demand makes it critical to effectively address safe, reliable and efficient power delivery worldwide. To help utilities meet these needs, Microsoft and its partners offer EMS, DMS and SCADA systems; mobile workforce management solutions; project intelligence; geographic information systems; smart metering/grid; and work/asset/document management tools that streamline business processes and offer connectivity across the enterprise and beyond.

Generation. Microsoft and its partners provide utilities with a view across and into their generation operations that enables them to make better decisions to improve cycle times, output and overall effectiveness while reducing the carbon footprint. With advanced software solutions from Microsoft and its partners, utilities can monitor equipment to catch early failure warnings, measure fleets’ economic performance and reduce operational and environment risk.

Energy trading and risk management. Market conditions require utilities to optimize energy supply performance. Microsoft and its partners’ enterprise risk management and trading solutions help utilities feed the relentless energy demands in a resource-constrained world.

Regulatory compliance. Microsoft and its partners offer solutions to address the compliance requirements of the European Union; Federal Energy Regulatory Commission; North American Electric Reliability Corporation; Sarbanes-Oxley Act of 2002; environmental, health and safety mandates; and other regional jurisdiction regulations and rate case issues. With solutions from Microsoft partners, utilities have a proactive approach to compliance, the most effective way to manage operational risk across the enterprise.

Enterprise. To optimize their businesses, utility executives need real-time visibility across the enterprise. Microsoft and its partners provide integrated e-business solutions that help utilities optimize their interactions with customers, vendors and partners. These enterprise applications address business intelligence and reporting, customer relationship management, collaborative workspaces, human resources and financial management.

The GridWise Olympic Peninsula Project

The Olympic Peninsula Project consisted of a field demonstration and test of advanced price signal-based control of distributed energy resources (DERs). Sponsored by the U.S. Department of Energy (DOE) and led by the Pacific Northwest National Laboratory, the project was part of the Pacific Northwest GridWise Testbed Demonstration.

Other participating organizations included the Bonneville Power Administration, Public Utility District (PUD) #1 of Clallam County, the City of Port Angeles, Portland General Electric, IBM’s T.J. Watson Research Center, Whirlpool and Invensys Controls. The main objective of the project was to convert normally passive loads and idle distributed generation into actively participating resources optimally coordinated in near real time to reduce stress on the local distribution system.

Planning began in late 2004, and the bulk of the development work took place in 2005. By late 2005, equipment installations had begun, and by spring 2006, the experiment was fully operational, remaining so for one full year.

The motivating theme of the project was based on the GridWise concept that inserting intelligence into electric grid components at every point in the supply chain – from generation through end-use – will significantly improve both the electrical and economic efficiency of the power system. In this case, information technology and communications were used to create a real-time energy market system that could control demand response automation and distributed generation dispatch. Optimal use of the DER assets was achieved through the market, which was designed to manage the flow of power through a constrained distribution feeder circuit.

The project also illustrated the value of interoperability in several ways, as defined by the DOE’s GridWise Architecture Council (GWAC). First, a highly heterogeneous set of energy assets, associated automation controls and business processes was composed into a single solution integrating a purely economic or business function (the market-clearing system) with purely physical or operational functions (thermostatic control of space heating and water heating). This demonstrated interoperability at the technical and informational levels of the GWAC Interoperability Framework (www.gridwiseac.org/about/publications.aspx), providing an ideal example of a cyber-physical-business system. In addition, it represents an important class of solutions that will emerge as part of the transition to smart grids.

Second, the objectives of the various asset owners participating in the market were continuously balanced to maintain the optimal solution at any point in time. This included the residential demand response customers; the commercial and municipal entities with both demand response and distributed generation; and the utilities, which demonstrated interoperability at the organizational level of the framework.

PROJECT RESOURCES

The following energy assets were configured to respond to market price signals:

  • Residential demand response for electric space and water heating in 112 single-family homes using gateways connected by DSL or cable modem to provide two-way communication. The residential demand response system allowed the current market price of electricity to be presented to customers. Consumers could also configure their demand response automation preferences. The residential consumers were evenly divided among three contract types (fixed, time of use and real time) and a fourth control group. All electricity consumption was metered, but only the loads in price-responsive homes were controlled by the project (approximately 75 kW).
  • Two distributed generation units (175 kW and 600 kW) at a commercial site served the facility’s load when the feeder supply was not sufficient. These units were not connected in parallel to the grid, so they were bid into the market as a demand response asset equal to the total load of the facility (approximately 170 kW). When the bid was satisfied, the facility disconnected from the grid and shifted its load to the distributed generation units.
  • One distributed microturbine (30 kW) that was connected in parallel to the grid. This unit was bid into the market as a generation asset based on the actual fixed and variable expenses of running the unit.
  • Five 40-horsepower (hp) water pumps distributed between two municipal water-pumping stations (approximately 150 kW of total nameplate load). The demand response load from these pumps was incrementally bid into the market based on the water level in the pumped storage reservoir, effectively converting the top few feet of the reservoir capacity into a demand response asset on the electrical grid.
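The reservoir-based bidding in the last bullet can be sketched in a few lines. This is an illustrative model only, assuming a linear relationship between reservoir headroom and willingness to pay; the function name, thresholds and prices are hypothetical, not project code.

```python
def pump_bids(level_ft, full_ft=20.0, num_pumps=5,
              pump_kw=30.0, base_price=0.05):
    """Return (kW, $/kWh) bid pairs for the municipal water pumps.

    The lower the reservoir, the higher the price the pumps are willing
    to pay to run -- i.e., the load becomes less curtailable as the
    stored-water buffer drains.
    """
    headroom = max(0.0, full_ft - level_ft)  # feet below "full"
    bids = []
    for i in range(num_pumps):
        # Each successive pump bids at a higher willingness-to-pay,
        # and all bids rise as the reservoir level drops.
        price = base_price * (1.0 + headroom) * (1 + i)
        bids.append((pump_kw, price))
    return bids
```

With a nearly full reservoir the pumps bid low (easily curtailed); as the level drops, their bids climb so the market is less likely to shed them.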

Monitoring was performed for all of these resources, and in cases of price-responsive contracts, automated control of demand response was also provided. All consumers who employed automated control were able to temporarily disable or override project control of their loads or generation units. In the residential real-time price demand response homes, consumers were given a simple configuration choice for their space heating and water heating that involved selecting an ideal set point and a degree of trade-off between comfort and price responsiveness.

For real-time price contracts, the space heater demand response involved automated bidding into the market by the space heating system. Since the programmable thermostats deployed in the project didn’t support real-time market bidding, IBM Research implemented virtual thermostats in software using an event-based distributed programming prototype called Internet-Scale Control Systems (iCS). The iCS prototype is designed to support distributed control applications that span virtually any underlying device or business process through the definition of software sensor, actuator and control objects connected by an asynchronous event programming model that can be deployed on a wide range of underlying communication and runtime environments. For this project, virtual thermostats were defined that conceptually wrapped the real thermostats and incorporated all of their functionality while at the same time providing the additional functionality needed to implement the real-time bidding. These virtual thermostats received
the actual temperature of the house as well as information about the real-time market average price and price distribution and the consumer’s preferences for set point and comfort/economy trade-off setting. This allowed the virtual thermostats to calculate the appropriate bid every five minutes based on the changing temperature and market price of energy.
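A minimal sketch of the five-minute bid a virtual thermostat might compute from those inputs, assuming a simple linear model. The parameter names and the shape of the bid curve are assumptions for illustration, not the project's actual algorithm.

```python
def thermostat_bid(temp, set_point, comfort, mean_price, std_price,
                   dead_band=2.0):
    """Heating bid ($/kWh) for one five-minute market cycle.

    temp, set_point : current and desired indoor temperature
    comfort         : 0.0 (maximum economy) .. 1.0 (maximum comfort)
    mean_price, std_price : rolling market price statistics

    The colder the house relative to its set point, the further above
    the average price the thermostat is willing to bid; the comfort
    setting scales how aggressively it bids.
    """
    deviation = (set_point - temp) / dead_band  # > 0 when too cold
    k = 1.0 + 2.0 * comfort                     # comfort widens the bid range
    return mean_price + k * deviation * std_price
```

At the set point the bid equals the market average; a cold house bids above it, so heating wins the auction more often as discomfort grows, which is the comfort/economy trade-off consumers configured.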

The real-time market in the project was implemented as a shadow market – that is, rather than change the actual utility billing structure, the project implemented a parallel billing system and a real-time market. Consumers still received their normal utility bill each month, but in addition they received an online bill from the shadow market. This additional bill was paid from a debit account that used funds seeded by the project based on historical energy consumption information for the consumer.

The objective was to provide an economic incentive to consumers to be more price responsive. This was accomplished by allowing the consumers to keep the remaining balance in the debit account at the end of each quarter. Those consumers who were most responsive were estimated to receive about $150 at the end of the quarter.
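The debit-account settlement reduces to a simple calculation. The sketch below is a hypothetical simplification: the seed amount, interval granularity and function name are assumptions.

```python
def quarterly_incentive(seed, usage_kwh, prices):
    """Shadow-market settlement for one quarter.

    seed      : funds seeded from historical consumption ($)
    usage_kwh : metered consumption per market interval (kWh)
    prices    : cleared price per interval ($/kWh)

    Each interval's consumption is debited at the cleared price; the
    consumer keeps whatever remains of the seeded funds.
    """
    cost = sum(kwh * p for kwh, p in zip(usage_kwh, prices))
    return max(0.0, seed - cost)
```

A consumer who shifts load away from high-priced intervals pays less out of the debit account and so keeps more of it, which is exactly the incentive the project intended.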

The market in the project cleared every five minutes, having received demand response bids, distributed generation bids and a base supply bid based on the supply capacity and wholesale price of energy in the Mid-Columbia system operated by Bonneville Power Administration. (This was accomplished through a Dow Jones feed of the Mid-Columbia price and other information sources for capacity.) The market operation required project assets to submit bids every five minutes into the market, and then respond to the cleared price at the end of the five-minute market cycle. In the case of residential space heating in real-time price contract homes, the virtual thermostats adjusted the temperature set point every five minutes; however, in most cases the adjustment was negligible (for example, one-tenth of a degree) if the price was stable.
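The five-minute clearing cycle resembles a uniform-price double auction. The sketch below, with hypothetical quantities and prices, matches the highest bids against the cheapest offers and clears at the midpoint of the last matched pair; the project's actual clearing engine, feeder-constraint handling and Mid-Columbia supply feed are not reproduced here.

```python
def clear_market(supply, demand):
    """Clear one five-minute cycle of a uniform-price double auction.

    supply, demand : lists of (quantity_kw, price_per_kwh) pairs
    Returns (clearing_price, cleared_kw); price is None if nothing trades.
    """
    offers = sorted(supply, key=lambda o: o[1])                 # cheapest first
    bids = sorted(demand, key=lambda b: b[1], reverse=True)     # dearest first
    oi = bi = 0
    oq = offers[0][0] if offers else 0.0
    bq = bids[0][0] if bids else 0.0
    cleared, price = 0.0, None
    # Match while the best remaining bid still covers the best remaining offer.
    while oi < len(offers) and bi < len(bids) and bids[bi][1] >= offers[oi][1]:
        take = min(oq, bq)
        cleared += take
        price = (bids[bi][1] + offers[oi][1]) / 2  # midpoint of marginal pair
        oq -= take
        bq -= take
        if oq == 0:
            oi += 1
            oq = offers[oi][0] if oi < len(offers) else 0.0
        if bq == 0:
            bi += 1
            bq = bids[bi][0] if bi < len(bids) else 0.0
    return price, cleared
```

Between cycles, each asset (thermostat, pump station, generator) recomputes its bid and resubmits, so the cleared price walks with both the wholesale feed and local conditions.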

KEY FINDINGS

Distribution constraint management. As one of the primary objectives of the experiment, distribution constraint management was successfully accomplished. The distribution feeder-imported capacity was managed through demand response automation to a cap of 750 kW for all but one five-minute market cycle during the project year. In addition, distributed generation was dispatched as needed during the project, up to a peak of about 350 kW.

During one period of about 40 hours that took place from Oct. 30, 2006, to Nov. 1, 2006, the system successfully constrained the feeder import capacity at its limit and dispatched distributed generation several times, as shown in Figure 1. In this figure, actual demand under real-time price control is shown in red, while the blue line depicts what demand would have been without real-time price control. It should be noted that the red demand line steps up and down above the feeder capacity line several times during the event – this is the result of distributed generation units being dispatched and removed as their bid prices are met or not.

Market-based control demonstrated. The project controlled both heating and cooling loads, which showed a surprisingly significant shift in energy consumption. Space conditioning loads in real-time price contract homes demonstrated a significant shift to early morning hours – a shift that occurred during both constrained and unconstrained feeder conditions but was more pronounced during constrained periods. This is similar to what one would expect in preheating or precooling systems, but neither the real nor the virtual thermostats in the project had any explicit prediction capability. The analysis showed that the diurnal shape of the price curve itself caused the effect.

Peak load reduced. The project’s real-time price control system both deferred and shifted peak load very effectively. Unlike the time-of-use system, the real-time price control system operated at a fine level of precision, responding only when constraints were present and resulting in a precise and proportionally appropriate level of response. The time-of-use system, on the other hand, was much coarser in its response and responded regardless of conditions on the grid, since it was only responding to preconfigured time schedules or manually initiated critical peak price signals.

Internet-based control demonstrated. Bids and control of the distributed energy resources in the project were implemented over Internet connections. As an example, the residential thermostats modified their operation through a combination of local and central control communicated as asynchronous events over the Internet. Even in situations of intermittent communication failure, resources typically performed well in default mode until communications could be re-established. This example of the resilience of a well-designed, loosely coupled distributed control application schema is an important aspect of what the project demonstrated.

Distributed generation served as a valuable resource. The project was highly effective in using the distributed generation units, dispatching them many times over the duration of the experiment. Since the diesel generators were restricted by environmental licensing regulations to operate no more than 100 hours per year, the bid calculation factored in a sliding scale price premium such that bids would become higher as the cumulative runtime for the generators increased toward 100 hours.
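The sliding-scale premium can be modeled as a bid that rises with cumulative runtime. The linear curve and scaling factor below are assumptions chosen only to illustrate the mechanism of rationing a permit-limited resource through price.

```python
def dg_bid(base_cost, hours_run, hours_limit=100.0):
    """$/kWh bid for a generator restricted to `hours_limit` hours per year.

    The bid starts at the unit's actual cost and climbs as cumulative
    runtime approaches the environmental-permit limit, so the market
    dispatches the unit more sparingly as its remaining hours dwindle.
    """
    if hours_run >= hours_limit:
        return float("inf")  # allowance exhausted: price itself out of the market
    scarcity = hours_run / hours_limit      # 0.0 fresh .. ~1.0 nearly exhausted
    return base_cost * (1.0 + 4.0 * scarcity)
```

Early in the year the diesels clear whenever prices spike modestly; late in the year only severe constraint events justify burning the remaining permitted hours.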

CONCLUSION

The Olympic Peninsula Project was unique in many ways. It clearly demonstrated the value of the GridWise concepts of leveraging information technology and incorporating market constructs to manage distributed energy resources. Local marginal price signals, as implemented through the market-clearing process, together with the overall event-based software integration framework, successfully managed the bidding and dispatch of loads and balanced the issues of wholesale costs, distribution congestion and customer needs in a very natural fashion.

The final report (as well as background material) on the project is available at www.gridwise.pnl.gov. The report expands on the remarks in this article and provides detailed coverage of a number of important assertions supported by the project, including:

  • Market-based control was shown to be a viable and effective tool for managing price-based responses from single-family premises.
  • Peak load reduction was successfully accomplished.
  • Automation was extremely important in obtaining consistent responses from both supply and demand resources.
  • The project demonstrated that demand response programs could be designed by establishing debit account incentives without changing the actual energy prices offered by energy providers.

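The debit-account incentive in the last bullet can be sketched as follows. This is an illustrative model, not the project's actual accounting; the pre-funding amount, the forgiveness rule and the method names are all assumptions. The point is that the customer's real tariff never changes: an account is pre-funded, consumption draws it down at a shadow market price, and the customer keeps whatever remains.

```python
class DebitAccount:
    """Sketch of a debit-account demand-response incentive.

    The customer's actual energy bill is untouched. A separate account
    is pre-funded each period; consumption is charged against it at a
    shadow market price, and the surplus is paid out to the customer,
    rewarding load shifted away from high-price periods.
    """

    def __init__(self, period_allocation):
        self.balance = period_allocation

    def charge(self, kwh, shadow_price):
        self.balance -= kwh * shadow_price

    def payout(self):
        # The customer keeps any surplus; a negative balance is forgiven,
        # so the worst case is simply the normal bill with no bonus.
        return max(self.balance, 0.0)
```

Under this design the incentive is pure upside for the participant, which is one reason such programs can recruit customers without altering the prices energy providers actually offer.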
Although technological challenges were identified and noted, the project found no fundamental obstacles to implementing similar systems at a much larger scale. Thus, it’s hoped that an opportunity to do so will present itself at some point in the near future.

The Smart Grid: A Balanced View

Energy systems in both mature and developing economies around the world are undergoing fundamental changes. There are early signs of a physical transition from the current centralized energy generation infrastructure toward a distributed generation model, where active network management throughout the system creates a responsive and manageable alignment of supply and demand. At the same time, the desire for market liquidity and transparency is driving the world toward larger trading areas – from national to regional – and providing end-users with new incentives to consume energy more wisely.

CHALLENGES RELATED TO A LOW-CARBON ENERGY MIX

The structure of current energy systems is changing. As load and demand for energy continue to grow, many current-generation assets – particularly coal and nuclear systems – are aging and reaching the end of their useful lives. The increasing public awareness of sustainability is simultaneously driving the international community and national governments alike to accelerate the adoption of low-carbon generation methods. Complicating matters, public acceptance of nuclear energy varies widely from region to region.

Public expectations of what distributed renewable energy sources can deliver – for example, wind, photovoltaic (PV) or micro-combined heat and power (micro-CHP) – are increasing. But unlike conventional sources of generation, the output of many of these sources is determined not by electricity load but by weather conditions or heat demand. From a system perspective, this raises new challenges for balancing supply and demand.

In addition, these new distributed generation technologies require system-dispatching tools to effectively control the low-voltage side of electrical grids. Moreover, they indirectly create a scarcity of “regulating energy” – the energy necessary for transmission operators to maintain the real-time balance of their grids. This forces the industry to try to harness the power of conventional central generation technologies, such as nuclear power, in new ways.

A European Union-funded consortium named Fenix is identifying innovative network and market services that distributed energy resources can potentially deliver, once the grid becomes “smart” enough to integrate all energy resources.

In Figure 1, the Status Quo Future represents how system development would play out under the traditional system operation paradigm characterized by today’s centralized control and passive distribution networks. The alternative, Fenix Future, represents the system capacities with distributed energy resources (DER) and demand-side generation fully integrated into system operation, under a decentralized operating paradigm.

CHALLENGES RELATED TO NETWORK OPERATIONAL SECURITY

The regulatory push toward larger trading areas is increasing the number of market participants. This trend is in turn driving the need for increased network dispatch and control capabilities. Simultaneously, grid operators are expanding their responsibilities across new and complex geographic regions. Combine these factors with an aging workforce (particularly when trying to staff strategic processes such as dispatching), and it’s easy to see why utilities are becoming increasingly dependent on information technology to automate processes that were once performed manually.

Moreover, the stochastic nature of energy sources significantly increases uncertainty regarding supply. Researchers are trying to improve the accuracy of the information captured in substations, but this requires new online dispatching stability tools. Additionally, as grid expansion remains politically controversial, current efforts are mostly focused on optimizing energy flow in existing physical assets, and on trying to feed asset data into systems calculating operational limits in real time.

Last but not least, this enables generation dispatch and congestion management to extend into low-voltage distribution grids. Although these grids traditionally carried energy one way – from generation through transmission to end-users – the increasing penetration of distributed resources creates a new need to coordinate the dispatch of these resources locally, and to minimize transportation costs.

CHALLENGES RELATED TO PARTICIPATING DEMAND

Recent events have shown that decentralized energy markets are vulnerable to price volatility. This poses a potentially significant economic threat for some nations: large industrial companies may leave deregulated markets because they lack visibility into long-term energy price trends.

One potential solution is to improve market liquidity in the shorter term by providing end-users with incentives to conserve energy when demand exceeds supply. The growing public awareness of energy efficiency is already leading end-users to be much more receptive to using sustainable energy; many utilities are adding economic incentives to further motivate end-users.

These trends are expected to create radical shifts in transmission and distribution (T&D) investment activities. After all, traditional centralized system designs, investments and operations are based on the premise that demand is passive and uncontrollable, and that it makes no active contribution to system operations.

However, the extensive rollout of intelligent metering capabilities has the potential to reverse this, and to enable demand to respond to market signals, so that end-users can interact with system operators in real or near real time. The widening availability of smart metering thus has the potential to bring with it unprecedented levels of demand response that will completely change the way power systems are planned, developed and operated.

CHALLENGES RELATED TO REGULATION

Parallel with these changes to the physical system structure, the market and regulatory frameworks supporting energy systems are likewise evolving. Numerous energy directives have established the foundation for a decentralized electricity supply industry that spans formerly disparate markets. This evolution is changing the structure of the industry from vertically integrated, state-owned monopolies into an environment in which unbundled generation, transmission, distribution and retail organizations interact freely via competitive, accessible marketplaces to procure and sell system services and contracts for energy on an ad hoc basis.

Competition and increased market access seem to be working at the transmission level in markets where there are just a handful of large generators. However, this approach has yet to be proven at the distribution level, where it could facilitate thousands and potentially millions of participants offering energy and systems services in a truly competitive marketplace.

MEETING THE CHALLENGES

As a result, despite all the promise of distributed generation, the increasingly decentralized system will become unstable without the corresponding development of technical, market and regulatory frameworks over the next three to five years.

System management costs are increasing, and threats to system security are a growing concern as installed distributed generating capacity in some areas exceeds local peak demand. The amount of “regulating energy” that must be provisioned rises as stress on the system increases; meanwhile, governments continue to push for greater distributed resource penetration and launch new energy efficiency initiatives.

At the same time, most of the large T&D utilities intend to launch new smart grid prototypes that, once stabilized, will be scalable to millions of connection points. The majority of these rollouts are expected to occur between 2010 and 2012.

From a functionality standpoint, the majority of these associated challenges are related to IT system scalability. The process will require applying existing algorithms and processes to generation activities, but in an expanded and more distributed manner.

The following new functions will be required to build a smart grid infrastructure that enables all of this:

New generation dispatch. This will enable utilities to expand their portfolios of generation dispatching tools to schedule generation assets across transmission and distribution. Utilities could thus better manage the growing number of parameters impacting the decision, including fuel options, maintenance strategies, the generation unit’s physical capability, weather, network constraints, load models, emissions (modeling, rights, trading) and market dynamics (indices, liquidity, volatility).
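A greatly simplified version of such multi-parameter dispatch can be sketched as a merit order over an effective marginal cost. This is an illustration only – real scheduling tools solve unit commitment and optimal power flow problems, not a greedy loop – and the data fields and cost model are assumptions. It shows how folding one extra parameter (an emissions price) into the cost ranking changes which units run.

```python
def dispatch(units, load):
    """Greedy merit-order dispatch sketch (illustrative only).

    units: list of dicts with 'name', 'fuel_cost' ($/MWh),
           'emission_cost' ($/MWh, e.g. allowance cost), 'capacity' (MW)
    load:  MW to serve
    Returns {name: MW} allocations, cheapest effective cost first.
    """
    # Effective marginal cost folds emissions pricing into the merit order.
    ranked = sorted(units, key=lambda u: u["fuel_cost"] + u["emission_cost"])
    schedule, remaining = {}, load
    for u in ranked:
        mw = min(u["capacity"], remaining)
        if mw > 0:
            schedule[u["name"]] = mw
            remaining -= mw
    if remaining > 0:
        raise ValueError("insufficient capacity for load")
    return schedule
```

With an emissions price included, a gas unit with higher fuel cost can still displace a coal unit whose allowance cost is larger, which is exactly the kind of trade-off the expanded tools must evaluate across many more dimensions.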

Renewable and demand-side dispatching systems. By expanding current energy management systems (EMS) capability and architecture, utilities should be able to scale to include millions of active producers and consumers. Resources will be distributed in real time by energy service companies, promoting the most eco-friendly portfolio dispatch methods based on contractual arrangements between the energy service providers and these distributed producers and consumers.
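One way an EMS can scale to millions of endpoints is to dispatch through aggregators rather than addressing each resource individually. The sketch below is a hypothetical data model, not an actual EMS interface: small resources are rolled up per energy service company, and a system-level request is split among aggregators in proportion to the flexible capacity they have under contract.

```python
from collections import defaultdict

def allocate_request(resources, mw_needed):
    """Sketch of aggregator-level dispatch (hypothetical data model).

    resources: list of (aggregator_name, flexible_mw) tuples for many
               small producers/consumers
    mw_needed: system-level flexibility request in MW
    Returns {aggregator_name: MW share of the request}.
    """
    pools = defaultdict(float)
    for aggregator, flexible_mw in resources:
        pools[aggregator] += flexible_mw      # roll up millions of endpoints
    total = sum(pools.values())
    if mw_needed > total:
        raise ValueError("request exceeds contracted flexibility")
    # Pro-rata split; each aggregator then dispatches its own contracts.
    return {agg: mw_needed * cap / total for agg, cap in pools.items()}
```

The central system then only tracks one aggregate per service provider, while each provider applies its own contractual and eco-friendly ranking to its portfolio.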

Integrated online asset management systems. New technology tools that help transmission grid operators assess the condition of their overall assets in real time will not only maximize asset usage, but will lead to better leveraging of utilities’ field forces. New standards such as IEC 61850 offer opportunities to manage such models more centrally and more consistently.

Online stability and defense plans. The increasing penetration of renewable generation into grids combined with deregulation increases the need for flow control into interconnections between several transmission system operators (TSOs). Additionally, the industry requires improved “situation awareness” tools to be installed in the control centers of utilities operating in larger geographical markets. Although conventional transmission security steady state indicators have improved, utilities still need better early warning applications and adaptable defense plan systems.

MOVING TOWARDS A DISTRIBUTED FUTURE

As concerns about energy supply have increased worldwide, the focus on curbing demand has intensified. Regulatory bodies around the world are thus actively investigating smart meter options. But despite the benefits that smart meters promise, they also raise new challenges on the IT infrastructure side. Before each end-user is able to flexibly interact with the market and the distribution network operator, massive infrastructure re-engineering will be required.

Nonetheless, energy systems throughout the world are already evolving from a centralized to a decentralized model. But to successfully complete this transition, utilities must implement active network management through their systems to enable a responsive and manageable alignment of supply and demand. By accomplishing this, energy producers and consumers alike can better match supply and demand, and drive the world toward sustainable energy conservation.