Amory Lovins and the Soft Path, 40 years on

31 October 2016

Co-founder, chief scientist, and chairman emeritus of Rocky Mountain Institute in Colorado, USA, Amory Lovins has been analysing trends in the energy sector for four decades, with some considerable success. David Flin looks at some of these early prognostications and spoke to him about current drivers in energy, not least of which is the advent of the electric car.

Market forces and changing technical capabilities create a rapidly changing landscape, and companies either adapt to that environment, or go the way of the dinosaurs. In particular, many power utilities believe that their objective is to sell electricity to the consumer at a cost-effective price, and the more electricity they sell, the better they are doing. In doing this, they need to adapt to a shifting environment. However, part of the difficulty of adapting to a shifting environment is understanding how that environment is changing. It can be hard to appreciate just how quickly the world can change, but Stanford lecturer Tony Seba offers a striking example. In 1900, photographs of the Easter Day parade in New York are dominated by horses and horse-drawn vehicles, and there is hardly a car to be seen. By 1913, similar photographs are dominated by cars, and there is barely a horse to be seen.

Back in 1976, Amory Lovins wrote his landmark article “Energy Strategy: The Road Not Taken?”, describing two alternative energy paths: one, the Hard Path, a centralised energy system based on coal and nuclear power; the other, the Soft Path, concentrating on energy efficiency and renewable energy technologies. The soft energy path flowed from treating energy not as an end in itself but as a means to an end. A customer wants to use electricity to operate lights and computers and freezers and air conditioning and ovens; they buy the electricity only as a way to get those services.

Forty years on, Lovins says the analysis proved very accurate on the demand side, predicting significant improvements in efficiency and the fall in energy intensity, but was less accurate on the supply side. In the Summer 2016 issue of Solutions Journal, the house magazine of Rocky Mountain Institute, Lovins noted that the analysis gave accurate foresight (within 1 per cent in 2000) about total US primary energy use. It posited a hypothetical energy intensity trajectory falling 72% in 50 years, compared with an actual fall of 56% in the first 40 years. The analysis suggested a halving of energy intensity “around the turn of the century”; this actually occurred in 2008. In contrast, the official forecasts of around 1975–6 proved too high by about 60–70%.
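The percentages quoted imply simple compound annual decline rates that can be checked in a few lines. The Python sketch below uses only figures from the article; treating the decline as a constant annual rate is an illustrative assumption, but it shows why a 72%-in-50-years trajectory puts the halving of energy intensity near the turn of the century:

```python
import math

def annual_decline(total_fall, years):
    """Compound annual decline rate implied by a total fractional fall."""
    return 1 - (1 - total_fall) ** (1 / years)

def halving_year(start_year, rate):
    """Year in which a quantity declining at `rate` per year halves."""
    return start_year + math.log(2) / -math.log(1 - rate)

lovins_rate = annual_decline(0.72, 50)   # hypothetical trajectory: ~2.5%/yr
actual_rate = annual_decline(0.56, 40)   # realised 1976-2016: ~2.0%/yr

# At ~2.5%/yr, intensity halves around 2003 ("the turn of the century");
# at the realised ~2.0%/yr, this constant-rate model puts the halving
# around 2010, close to the actual date of 2008.
print(halving_year(1976, lovins_rate), halving_year(1976, actual_rate))
```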

In 1976, Lovins’s article also said: “The commitment to a long-term coal economy many times the scale of today’s makes the doubling of atmospheric carbon dioxide concentration early in the next century virtually unavoidable, with the prospect then or soon thereafter of substantial and perhaps irreversible changes in global climate. Only the exact date of such changes is in question.”

In 1976, almost no one foresaw obtaining gas cheaply from deep formations not associated with oil. It is hard to remember that 40 years ago, natural gas was considered so scarce that federal US policy outlawed its use in power stations from 1978 to 1987, and strongly promoted coal-fired generation instead.

Renewables, by contrast, faced difficulties rather than being granted a permissive legislative platform. As a result, although renewables have gained ground and are now mainstream, this happened some 35 years later than Lovins’s soft-path graph suggested strong policy support could have achieved.

This is what happens when the wrong question is answered. The problem that had been posed was how to get more energy, and planners extrapolated historic growth in energy demand and built supply to meet it. The correct question to have asked would have been what we want energy for, and how best to deliver those services. The customer wants to feel warm in winter, and options include increasing the insulation and airtightness of the house and making better use of the heating already in place in addition to adding more heating capability.

Edison was right

When Henry Morton, President of the Stevens Institute of Technology, was asked for his views on Edison’s development of the light bulb, he said: “Everyone acquainted with the subject will recognise it as a conspicuous failure.”

Edison proposed that the company charge for the services it supplied, such as lighting, rather than the amount of electricity it supplied. This was not followed after 1892, and the model of charging for electricity delivered gained traction, resulting in utilities having a vested interest in not improving energy efficiency. If Edison’s suggestion had been followed, utilities would make increased profits by supplying energy efficient appliances, which would also reduce the need for capital investment in power plant infrastructure. Improvements in energy efficiency have been made despite this, but as Lovins points out, those have reduced utilities’ revenues not their costs.

Utilities have instead been perversely encouraged to supply more and more electricity in order to increase revenue. This requires large-scale projects undertaken with major social and financial commitments that do not see a return on investment for years, and it has led to the subsidies, $100 billion bailouts, massive regulatory regimes, and social disruption that Lovins’s 1976 article warned about.

If we had asked the correct question in the first place, utilities might well be supplying ovens with rechargeable batteries to be connected up to small solar panels, and thus save the need for building expensive power plants, extending transmission and distribution networks, finding ways of balancing power demand and supply, and monitoring electricity use. The consumer ends up able to cook, doesn’t have to pay for the supply of electricity, and is not affected if a tree falls on a power line 50 miles away.

Twenty years ago

In 1996, it would have been a stretch to see the cost of renewable generation falling to the extent that it has. Unsubsidised solar PV in Mexico has been contracted for at a long-term constant price of 3.55 US cents/kWh, while in other global auctions, unsubsidised onshore wind generation has fallen to 3 US cents/kWh. By 2015, all renewables supplied almost 22% of global electricity generation and over half of new capacity additions.

At the same time as renewables have entered the mainstream and become cost-competitive, nuclear power has entered a possibly terminal decline. Outside China, nuclear power generation has declined over the past 20 years, and even within China, wind power is outstripping nuclear generation by a considerable margin. We took for granted that the operating expenses of nuclear would remain low, says Lovins, but that has proven not to be the case. In 2010–12, the average OpEx at a top-quartile-OpEx nuclear plant in the USA was $62/MWh, making it non-competitive. As plants get older, they require more maintenance and repair. Even the average US OpEx is nearly $40/MWh.
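These figures are easiest to compare on a common basis. A quick conversion, using only the numbers quoted in the article, puts the renewable contract prices in the same $/MWh units as the nuclear operating costs:

```python
def cents_per_kwh_to_usd_per_mwh(cents_per_kwh):
    # 1 MWh = 1000 kWh and 100 cents = 1 USD, so multiply by 10.
    return cents_per_kwh * 1000 / 100

solar_mexico = cents_per_kwh_to_usd_per_mwh(3.55)  # 35.5 $/MWh contract price
onshore_wind = cents_per_kwh_to_usd_per_mwh(3.0)   # 30.0 $/MWh contract price
nuclear_opex_top_quartile = 62.0                   # $/MWh, US average 2010-12

# The operating cost alone of even a top-quartile US nuclear plant
# exceeds the all-in contract prices of new unsubsidised solar and wind.
assert nuclear_opex_top_quartile > solar_mexico > onshore_wind
```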

Improved rotor blades and greater tower heights have lately enabled wind turbines to be optimised for moderate Class 3 wind profiles, rather than being limited to powerful Class 5+ wind profiles. This makes many more sites viable (now in every US state), and gives the option of locating viable wind turbines close to a demand centre, minimising new transmission networks.

Capacity factors for renewables are still increasing, further lowering the cost of renewable energy.

Where do we go from here?

Charles Duell, Commissioner of the US Office of Patents, said in 1899: “Everything that can be invented has already been invented.”

The day of large, centralised power stations is fading, as is national dependence on a single, monolithic grid. Instead, we shall see moves towards interlocking micro-grids, which can isolate or be isolated from a main grid, with a wide variety of generators connecting local capacity to the point of use. This will increase the resilience and security of the network, and will avoid the problems that have been seen where an event, be it natural, accidental, or a deliberate attack, brings down a whole network.

This would eliminate situations like the 1999 south Brazil blackout, or the 2003 northeast US blackout, where a single source problem cascaded through the system. The power outage in Ukraine in December 2015 resulting from a sophisticated cyber-attack against distribution control systems exemplifies the danger of relying on a single grid network.

The growth in a variety of groups intent on causing harm is a concern, and it makes little sense to have a single system that can be targeted and brought down, with consequent major disruption to millions of people, as Lovins analysed at length in Brittle Power, a 1982 book for the Pentagon.

Denmark has been focused on developing a Smart Grid, which is designed to optimise energy efficiencies and provide robust security to the network. The plan is to use high levels of metering and communication technology in homes to balance fluctuating supply and demand, and to increase the flexibility of the system. According to the Danish Ministry of Climate, Energy, and Building: “The electric system only works if there is enough capacity to supply the ancillary technical services. Large central power plants have thus far supplied most of these necessary technical services. As the large power plants are expected to be phased out, these properties must be ensured in other ways. In the long term, the Smart Grid has the technological potential to provide some of these services to the power grid.”

Energy/IT mash-up

In 1995, Newsweek wrote: “Nicholas Negroponte, Director of the MIT Media Lab, predicts that we’ll soon buy books and newspapers over the Internet. Uh, sure. My local mall does more business in an afternoon than the entire Internet handles in a month.”

The rapid rise in information and communication technology has had major effects, enabling an astonishing level of individual control. Bi-directional tariffs enabling both buying and selling of electricity are no longer cutting edge, and we are seeing developments enabling households to buy and sell electricity directly with neighbours. This enables many exchanges to take place without the need for intervening third parties.

Utilities have to jump across the meter. Customers are already figuring out how to make, buy, and sell electricity, and attempts by utilities to prevent this by regulation or discriminatory tariffs will fail dismally, and will only have the effect of annoying customers, making them even more determined to find cost-effective ways of bypassing the utility. Utilities will have to learn to embrace this change, and work out ways of making it work for them.

Individual electricity exchange will help balance supply and demand, as it will enable myriad small exchanges. Electricity swapping and direct customer-to-customer sales (as on the Dutch site Vandebron) could become common as peer-to-peer exchanges become the norm. Smart charging of batteries plugged in overnight already allows the charger to optimise charging rates according to the price and availability of electricity.
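A smart charger of this kind is, at bottom, a simple optimisation: given an hourly price forecast, fill the battery in the cheapest hours first. The greedy sketch below is a minimal illustration with hypothetical prices and parameters; real chargers also respond to availability signals and grid constraints:

```python
def schedule_charging(prices, energy_needed, max_rate):
    """Greedy smart-charging sketch: draw energy in the cheapest hours first.

    prices: hourly price forecast ($/kWh); energy_needed: kWh to deliver;
    max_rate: maximum kWh per hour. Returns the kWh drawn in each hour.
    """
    plan = [0.0] * len(prices)
    remaining = energy_needed
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0:
            break
        draw = min(max_rate, remaining)
        plan[hour] = draw
        remaining -= draw
    return plan

# Hypothetical overnight price forecast: cheapest in the small hours.
prices = [0.12, 0.10, 0.06, 0.05, 0.05, 0.07, 0.11, 0.15]
plan = schedule_charging(prices, energy_needed=20, max_rate=7)
# Charging concentrates in the three cheapest hours; the expensive
# evening and morning peaks are left untouched.
```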

What could cause a movement towards mass use of smart chargers to recharge batteries? If such a thing were to become ubiquitous, it would revolutionise the demand side of the electricity equation. What trend could bring this about?

Electric cars

“The horse is here to stay, but the automobile is only a novelty, a fad.” This was how the President of Michigan Savings Bank advised Henry Ford’s lawyer in 1903 when asked about the advisability of investing in the Ford Motor Company.

The sales of electric vehicles are growing at 60% per year, and are being particularly encouraged in China and India. The stock of electric vehicles currently represents only 0.1% of the total stock of vehicles on the road, but this proportion is rapidly increasing. In 1900, very few people expected the horse to be replaced by the automobile. By 1913, horses had become obsolete as a means of city transport. Tony Seba reports that the fraction of US households owning a car rose from 8% to 80% in just a decade, 1918–28, three-fourths via an innovation called “car loans”.

It needs to be remembered that in the 1980s, renewable energy was a fringe energy source, typically used only in situations where traditional large power plants were not viable; yet within 20 years it had become mainstream, and within 30 years, a dominant force in some countries. When change happens, it can take place with astonishing rapidity. This will apply to the electric vehicle, and it will have a number of consequences. The greater the number of electric vehicles operating, the greater the pressure to improve the technology involved.

Battery technology will advance dramatically, and batteries will become lighter, easier and quicker to recharge, and cheaper. This battery technology will be easily transferable to stationary applications, and with production volumes around 1 TWh pa by the mid-2020s, will affect both the supply and demand sides of electricity.

Combined with smart chargers, the demand side will lose its deeper troughs, as these are when recharging will take place. Increased battery capacity will enable variable renewable supply in low-demand periods to charge batteries for use in peak-demand periods, which in turn will increase the productivity of these sources. The argument that renewables are limited by their variability, already a dubious proposition, will become patent nonsense.

Manufacturers are experimenting with making electric vehicles lighter by using carbon fibre for the structure, which can cut the costly battery requirement by more than half. As sales grow, more work will go into making the vehicles lighter, cheaper, and more efficient, which will have knock-on effects for developments in materials science, which could have incalculable effects in other fields.

The development of smart chargers in large numbers for individuals will also provide encouragement for the increased application of domestic renewable generation systems. Such systems are already cost-competitive with grid power, and greater demand will bring about further improvements, which will improve their economics, which in turn will reduce the share of large centralised power plants.

Insurgents and incumbents

When circumstances are changing rapidly, new entrants to the sector have the advantage of not being encumbered by obsolete infrastructure, business models, and cultures. Incumbents tend to believe that they can control the market by imposing restrictions on new entrants, but this will not work for long.

Most insurgents will be comparatively short of capital, so capital productivity will be vital to their success. They will build smaller units that come on line and produce a return more quickly, leading to distributed generation. This ties in with the model of multiple interlocked micro-grids taking over from a single, monolithic national grid.

Using what Lovins calls Negawatts (increasing energy efficiency to get the same amount of services out of less electricity) and Flexiwatts (shifting electricity use in time to better balance surplus and shortage) can defer capital investment for a considerable period.
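The capital-deferral argument can be made concrete with a toy peak-capacity calculation. The demand profile and percentages below are hypothetical; the point is only that either fewer kilowatt-hours (negawatts) or better-timed ones (flexiwatts) shave the peak that new plant would otherwise have to be built to meet:

```python
# Hypothetical hourly demand profile (MW) for a small system.
demand = [40, 35, 30, 30, 45, 70, 90, 85, 60, 50, 45, 40]

def peak_after_negawatts(load, efficiency_gain):
    """Negawatts: the same services from less electricity,
    modelled here as a uniform scaling of demand."""
    return max(x * (1 - efficiency_gain) for x in load)

def peak_after_flexiwatts(load, shiftable_mw):
    """Flexiwatts: shift up to `shiftable_mw` out of each hour;
    assumes off-peak hours can absorb the displaced load."""
    return max(x - shiftable_mw for x in load)

base_peak = max(demand)                               # 90 MW of capacity needed
with_negawatts = peak_after_negawatts(demand, 0.15)   # 15% efficiency gain
with_flexiwatts = peak_after_flexiwatts(demand, 10)   # 10 MW shiftable load
```

Either measure reduces the peak, and so defers the capital cost of the next increment of generating plant.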

The imperatives of capital investment will drive developments in the industry. Small, diverse, flexible generation systems connected to interlocking microgrids, combined with smart controls, smart chargers, cheap batteries, and web transferral systems, are the future. This broad trend is unavoidable, although some of the details may be contingent.

Contrarian options

Simon Cameron, a US Senator, said in 1861: “I am tired of this sort of thing called science. We have spent millions on this in the last few years, and it is time it should be stopped.”

Inevitably, some energy policy makers take a contrarian view, devising policy that seems aimed at turning failed initiatives into successes by increasing investment in them, while reducing investment in areas that are proving successful.

For example, building large-scale new nuclear facilities under dubious financing strategies, while at the same time reducing support for energy efficiency and small-scale renewables, is a recipe that will demonstrate the dictum that: “However bad things are, they can always get worse.”

Consumers have increasing access to information and options, and reality will always assert itself against policy based on wishful thinking and ego-based mega-projects.

It is perhaps worth noting that PG&E has announced its intention to shut down Diablo Canyon, California’s last operating nuclear power plant, closing both units by 2025 and replacing the 2160 MW output with a combination of renewable sources, energy storage, better energy efficiency, and changes to the power grid. Officials from PG&E said that re-licensing the plant and operating it to 2044 would be more expensive than the proposed plan, which will lower customer demand and take advantage of declining costs for renewable power – exactly the logic laid out 40 years ago in Lovins’s “Energy Strategy: The Road Not Taken?”.

Amory Lovins: 40 years of analysing trends in the energy sector (photo: Judy Hill Lovins)
New York Easter Day parade, 1900
New York Easter Day parade, 1913
Global annual sales of light-duty plug-in electric vehicles, 2011–2015
