August 14, 2003. It was one of those hot afternoons in late summer, and the New York City subway was gearing up for rush hour. Suddenly, trains groaned to a halt. Above ground, people surged out of unlit buildings onto crowded streets. Cars, taxis, and buses swerved and honked, negotiating the human outpouring. Traffic lights, dark and unresponsive, swayed above the chaos. Gridlock was immediate.
And, ironically, gridlock was the cause. The great northeast blackout that affected 50 million people actually began in Ohio, where a few sagging power lines touched off a chain reaction. Like falling dominoes, eight states and southeastern Canada plunged into darkness. Nuclear power plants shut down in New York state and Ohio, while in the air, planes were rerouted from Ottawa to Cleveland. Industry and government officials pointed an accusing finger at an interconnected but archaic grid system.
When Richard Hirsh, professor of history at Virginia Tech, learned of the blackout, he registered little surprise. Hirsh, a scholar of technology through the ages, says, “Our electricity system, begun a century ago, consists of a hugely complicated and interdependent network of large centralized power plants and wires (called a transmission grid) to bring the power to customers. An accident within such a complex system was almost inevitable.”
But Hirsh isn’t just a scholar of history with an informed opinion on blackouts; he is proactive in the quest for a better grid and leads an interdisciplinary team of Virginia Tech researchers in a search for solutions to this many-layered problem. Funded by the National Science Foundation (NSF), members of the Consortium for Energy Restructuring (CER) at Virginia Tech are trying to encourage the use of small-scale generation technologies as part of an approach to create a more secure and efficient electric utility system. Group members include academics and graduate students from disciplines that span engineering, business, consumer affairs, and science and technology studies.
The Consortium for Energy Restructuring
While today’s utility system depends on aging power plants and vulnerable long-distance transmission lines, the future scheme, according to Virginia Tech’s consortium, will rely, at least in part, on distributed generation technologies that will provide power more efficiently, more cleanly, and potentially at lower cost than traditional generation technologies. So far, these potential advantages have been inhibited by technical limitations, the lack of attractive business models, and general consumer confusion. Taking a holistic approach, however, the Virginia Tech team integrates power electronics technologies with novel business prototypes, as well as social science and consumer research. The CER incorporates new grid-interface concepts, a computerized business simulator, and public policy and consumer information that can be translated into testimony for legislatures and regulatory agencies. Ultimately, the work will also be used as a series of educational modules for Virginia Tech students and energy managers of the future.
Creating a new electric power system archetype will require a robust network of distributed generation (DG) sources, such as fuel cells, micro-turbines, and renewable wind and solar energies, along with traditional centralized power plants using fossil fuels. Recent progress in small-scale generation technologies has suggested the possibility of ownership and operation of generation units by manufacturing facilities, industrial parks, office parks, hospitals, universities, government agencies, housing subdivisions, apartment complexes, electric cooperatives, and other consumer organizations. These owners will benefit from an uninterruptible power supply at a reasonable price and the opportunity to sell excess power into the power markets.
So why aren’t small-scale generation technologies widely used? Each requires a customized power-conversion device to interface with the grid.
Under the guidance of Associate Professor Fred Wang and his colleagues at the Center for Power Electronics Systems, the Virginia Tech team envisions user-friendly “plug and play” grid-interface connectors. The CER engineers have proposed improvements to existing power electronics building blocks (PEBBs) with advanced digital protection to enable easy installation of distributed generation technologies into the power grid. Basically, with embedded intelligence for grid interface, PEBBs become SPEBBs (smart power electronics building blocks). These smart modules will facilitate a simpler communication structure.
“Instead of being custom-made for each application, the SPEBBs can be mass-produced,” says Wang, “thus reducing human error and increasing dependability at greatly reduced costs.”
In addition, the Virginia Tech researchers advocate the use of smart reconfigurable protection modules (SRPM). An SRPM consists of a digital protection system with embedded intelligence and communication channels that can adapt power relay functions in response to an unexpected change in the system at a DG site, says Virgilio Centeno, assistant professor of electrical engineering. “An example would be the loss of one of the transmission lines that bring power to a distribution system,” he says. “Such an event would cause an energy imbalance, forcing other DG units and the remaining transmission system to make up for the loss. The use of SPEBBs and SRPM together will significantly improve system efficiency and security, a growing concern in our post-9/11 world.”
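Centeno’s example of adaptive relaying can be caricatured in a few lines of code. The class name, thresholds, and adaptation rule below are invented purely for illustration; an actual SRPM works from real-time measurements and dedicated communication channels, not the toy numbers shown here.

```python
from dataclasses import dataclass

@dataclass
class SmartRelay:
    """Toy sketch of a reconfigurable protection relay (illustrative only)."""
    overcurrent_trip_A: float = 400.0  # trip threshold under normal topology
    connected: bool = True

    def on_topology_change(self, lines_in_service: int, lines_total: int) -> None:
        # Losing a transmission line forces more current onto the remaining
        # paths, so the relay relaxes its threshold (capped at 2x here)
        # instead of tripping spuriously on the redistributed load.
        share = lines_in_service / lines_total
        self.overcurrent_trip_A = 400.0 / max(share, 0.5)

    def measure(self, current_A: float) -> None:
        if current_A > self.overcurrent_trip_A:
            self.connected = False  # open the breaker

# A fixed relay trips on the post-contingency current...
static = SmartRelay()
static.measure(450.0)
# ...while an adaptive one, told that one of two lines was lost,
# raises its threshold to 800 A and rides through the same current.
adaptive = SmartRelay()
adaptive.on_topology_change(lines_in_service=1, lines_total=2)
adaptive.measure(450.0)
print(static.connected, adaptive.connected)
```

The point of the sketch is the design choice Centeno describes: the protection logic is a function of system state, not a fixed setting, so the same hardware module behaves correctly both before and after a contingency.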
The best way to encourage the use of DG technologies is to show that they are economically viable. The consortium’s business group members want to make that dream a reality. They are developing business models by building a computer simulation software package. The Virginia Tech Electricity Grid and Market Simulator “is a robust tool for educational case studies in the field of power systems design, engineering, and management,” says Ralph Badinelli, professor of business information technology. “Computer simulation is unique among modeling techniques in its ability to support case analysis and decision modeling for complex systems.”
This educational tool supports various decision-making domains, including:
• Public policy decision problems – balancing environmental guidelines, market regulation, and infrastructure planning
• Engineering decision problems – designing system protection, selecting transmission and generation technologies
• Business decision problems – generation capacity planning, unit commitment, optimal dispatch, demand side response, trading strategy, financial risk management
For each of these decision problems, key performance indicators, such as long-range cost, reliability, pollution, market equity, and financial risk, must be analyzed.
“Decisions related to power systems have outcomes that play out over a long time and can take on many scenarios due to randomness in the parameters, such as load and market prices,” says Badinelli. These relationships are typically complex and nonlinear. In a simulation model of a power system, energy flows through each network element, and the cash flows associated with the power flows can be modeled for each hour of every day. From hour to hour, the simulation program updates the status of each generation unit and stores this status in computer files. From these outputs, the key performance indicators of the power system are computed.
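The hour-by-hour loop Badinelli describes can be sketched as a toy merit-order dispatch. Every unit, cost, and load figure below is invented for illustration and bears no relation to the actual Virginia Tech simulator; the sketch only shows the shape of the computation: dispatch generation each hour, accumulate cash and emission flows, and report key performance indicators at the end.

```python
import random

# Illustrative generation units: (name, capacity_MW, cost_$_per_MWh, tons_CO2_per_MWh).
# All figures are made up for this sketch.
UNITS = [
    ("coal_plant",    500, 25.0, 1.00),
    ("gas_turbine",   200, 45.0, 0.50),
    ("micro_turbine",  50, 60.0, 0.40),
    ("wind_farm",     100,  5.0, 0.00),
]

def simulate(hours=24, seed=42):
    """Merit-order dispatch for each hour; accumulate key performance indicators."""
    rng = random.Random(seed)
    total_cost = total_emissions = unserved = 0.0
    for _ in range(hours):
        # Random hourly load (a real simulator would use measured demand profiles)
        load = 600 + 150 * rng.random()
        remaining = load
        # Dispatch the cheapest units first (merit order)
        for name, cap, cost, co2 in sorted(UNITS, key=lambda u: u[2]):
            dispatched = min(cap, remaining)
            remaining -= dispatched
            total_cost += dispatched * cost
            total_emissions += dispatched * co2
        unserved += max(remaining, 0.0)  # reliability indicator
    return {"cost_$": round(total_cost, 2),
            "emissions_tons": round(total_emissions, 2),
            "unserved_MWh": round(unserved, 2)}

print(simulate())
```

In the real simulator, each hourly step would also update unit status (outages, ramp limits, commitments) and market prices, and the resulting files would feed the long-range cost, reliability, pollution, and financial-risk indicators listed above.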
As a practical matter, a computerized educational support system is effective only if its interfaces are transparent and easy to learn. Coupling a geographic information system (GIS) to the simulation program provides a seamless interface for the user between data entry and key performance indicators.
The grid and public policy
Since Thomas Edison built the world’s first generating plant on Pearl Street in New York City’s financial district in 1882, most electric power has been created by large plants outside of cities and sent through distribution lines to customers.
“Monopoly utilities built long-distance transmission lines to connect one power company to another, helping to ensure power supplies in case of emergencies, in a system supported by a web of federal and state government regulations,” says Hirsh. “These laws protected consumers but also ensured that the companies made enough money to allow for further expansion.”
The energy crises of the 1970s and ’80s sent policymakers in search of a new fix. Given that period’s interest in deregulating industries, such as telecommunications and aviation, the policymakers reasoned that if the electric power system were regulated less and allowed to operate in a free market, then a better product should be available at a lower price, says Hirsh.
Deregulation of power got a big boost in 1996 when California consumers were permitted to shun their former monopoly provider and select another producer of power. Although electricity was ultimately delivered through the monopoly’s wires, customers could choose who produced their power, electing companies that used renewable energy technologies, for example. By 2000, due to a variety of factors — including power companies looking to sell their product elsewhere, a drought in the Northwest, rising natural gas prices, and the lack of long-term contracts — California’s wholesale prices quintupled and electricity supplies dwindled. The state imposed price caps to protect consumers, but one company went bankrupt and rolling blackouts were used as a way to ration electricity, according to Hirsh.
After that, says Hirsh, “lawmakers proceeded warily. Some states have moved toward more regulation, others toward less. Still in flux, companies and consumers don’t know if they will be operating in a competitive market, a more regulated environment, or some mixture of the two.”
In a partially deregulated environment, there is growing concern over how transmission systems are monitored and who pays for maintenance and expansion, he says. In some states, the transmission system moves electricity primarily within state borders and is run by a state agency. “While the utilities own the lines, they do not have control over them and are therefore not eager to invest the billions needed to keep the system viable.”
In other states, rates remain regulated by the Federal Energy Regulatory Commission, and utilities still own the transmission lines and manage the transfer of power through them. “The lack of investment in the transmission system for more than 20 years, plus the fact that the move to competitive markets requires much more movement of electricity, contributed to the great northeastern blackout of 2003,” says Irene Leech, president of the Consumer Federation of America.
Some of the grid’s problems could be alleviated through the use of distributed generation (DG) technologies. “It makes sense to begin moving toward a decentralized system that contains small-scale, modular, and diverse types of equipment that produce power close to cities or even within buildings that use a lot of electricity,” says Hirsh. “Employing diesel generators, or better yet — from an environmental point of view — fuel cells, micro turbines, wind turbines, and photovoltaic cells, such a system would reduce the strain on the existing grid by providing power to users without depending on transmission lines at all.”
The economic viability of DG technologies has improved in recent years. Enhancements in net metering, fuel conversion technology, and thermal engineering have accompanied developments in automation and control, improving efficiency and reducing the need for periodic maintenance.
Consumer advocates note that DG technologies are more efficient, since transmission from large, centralized plants typically wastes up to 9 percent of electricity due to wire resistance, inconsistent enforcement of reliability guidelines, and growing bottlenecks. Beyond efficiency, DG technologies can provide more reliable power for digital and telecommunications industries that require uninterrupted service. In addition, by producing local power, DG can decongest the grid by reducing demand during peak times. Decentralized power generation also reduces the terrorist targets that are offered by large nuclear and conventional facilities and natural gas refineries.
Educating the consumer
While Hirsh seeks to discern the non-technical impediments to implementing DG technologies, a major hurdle that remains is the public’s lack of understanding about how the utility system works.
Irene Leech and JoAnn Emmel, professors in the College of Liberal Arts and Human Sciences, spearhead the consumer affairs research component of the Consortium for Energy Restructuring. Gauging consumer understanding of DG technologies is important for two reasons, says Leech. First, consumers are impacted by higher or lower electricity costs and need to perceive the benefits or disadvantages when DG technologies are located near them. Second, energy consumers are active agents in a political constituency. They can influence the policymakers and regulators who have power to erect or dismantle impediments to using new technologies.
In Virginia, residential retail electricity rates are below the national average, with the western part of the state enjoying rates among the lowest in the nation. Consequently, the group’s research demonstrates that Virginia consumers have had little incentive to learn how power is produced or distributed. Electricity restructuring officially began in 1999 in Virginia, but according to Leech, the news media have not covered the issue well, deeming it “too complex for citizens to understand” and avoiding the topic. “As a result, most Virginians do not realize that the rules are changing.”
“Many consumers in the United States realize that energy restructuring has had many problems,” says Leech. “They are aware of the failure of Enron, the associated accounting crisis, and utilities involved in unfair trades. Unless they have been directly affected, however, most consumers are not familiar with the issues or the opportunities made possible by energy restructuring.”
“To be successful,” she says, “a DG strategy must be understood and accepted by the public.”
What captures consumers’ attention? Infrastructural changes draw notice, particularly ones that spoil the view. For example, the route of large transmission lines created a “not in my back yard” mentality a few years back. “Yet, few people realize that the use of DG technologies would mean fewer power lines and large plants,” says Emmel. “Encountering new and complicated bills, where customers receive several invoices for what appears to be one service, will also influence public understanding. Price volatility definitely makes consumers take notice, while environmental impact and cleaner, renewable resources may influence others to become more informed.”
A Virginia pilot program to test processes needed to develop a competitive market drew extremely limited participation, largely due to the unwillingness of incumbent providers to accept reduced revenue or to give up market share. Competitive offers to consumers failed to arrive, since no new service provider felt it could battle the low rates. Another state education campaign, known as Energy Choice, was killed after two years when government funding dried up, despite the fact that its first report revealed how little Virginians knew about electric utility restructuring.
Now, two years after the program’s suspension, it appears that a competitive market will not develop in the near future. “Decision makers do not want to be accused of misleading consumers about the availability of a competitive market,” says Leech, “so they extended the transition period by three years until 2010.”
The consortium’s work on the business and consumer elements of distributed generation technologies highlights the social nature of technological change. While seeking to develop new hardware that will allow DG entrepreneurs to connect seamlessly to the grid, the group realizes that non-technical factors seem to be greater impediments to the acceptance of the small-scale generation technologies. As one example, the researchers point to the difficulty encountered by a national engineering committee in creating standards for interface technologies that would allow DG users to plug into the grid. With a 400-person committee representing scores of investor-owned and public utilities, manufacturers, government labs, and state agencies, consensus will be difficult to obtain.
“DG technologies have also been inhibited by the inertia of financial subsidies and incentives,” says Leech. The tax code for the years 2000-04, for example, included $13 billion for production, research, and development credits in the energy industries. Of that amount, $10.2 billion went to traditional fossil fuel technologies, while nuclear power received $1.5 billion. Renewable energy technologies — a form of DG — were allocated just $400 million. “Similarly, lingering monopoly rules and discriminatory rate structures — often taking the form of exit fees, backup tariffs, and connection surcharges — still exist in many states and create political obstacles to investments in DG technologies,” she says.
The tasks of the Virginia Tech consortium are considerable. There is the technical conundrum of plugging in. Then, there is the development of a complex, multileveled business policy. The social, political, and consumer issues are also intricate. But there is hope in the form of education. The Consortium for Energy Restructuring is assembling an interdisciplinary minor in energy management at Virginia Tech. It is intended to serve as a model for other universities and to educate professionals in the changing energy industry. Thanks to this collaborative research and ongoing curriculum development, there is light at the end of the blackout.
— Jean Elliott, College of Liberal Arts and Human Sciences