As noted previously in these pages, the Internet and the digital economy are proving to be major drivers of increased electricity consumption. From nothing ten years ago, the equipment on which the Internet runs is now estimated – at least by some – to account for an astonishing 8 per cent of US electricity consumption.
Recently, CityReach International, a builder of internet exchanges, also called “internet hotels” (in which large groups of servers are installed, sharing common services and infrastructure, including electricity), reported that a lack of sites with suitable power supplies was causing serious delays in the installation of these facilities. Take the case of London. According to CityReach International, “We should be learning from the problems already experienced in Silicon Valley, otherwise London risks being the loser as other European cities bend over backwards to protect inward investment by accommodating the power requirements of the internet economy.” And it is by no means just a question of quantity. The power needs of computers are driving power quality requirements and expectations up to unprecedented levels, while the overstretched power supply system is being asked to deliver a level of reliability it was never designed to provide. Lights, motors and furnaces can tolerate power disturbances that would be fatal for computers.
Writing recently in our sister publication, Power Economics, Mark P Mills observed: “The old electric grid is astonishingly reliable, considering its sprawl, available 99.9 per cent of the time. But this is a level of reliability literally orders of magnitude too poor for a microprocessor-based economy. Reliability at the 99.9999+ per cent level is the issue that rapidly emerges as the central subject and defining force that will reshape the global electric industry. Data centres and digital information warehouses, web hosts and telecomm buildings in the multi-megawatt range are now common and growing rapidly. The combination of these levels of demand with the reliability paradigm is deeply and permanently disruptive to the incumbent electric industry…”
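The gulf between the two availability figures Mills cites is easy to underestimate. As a rough back-of-envelope illustration (our own, not from the article), the expected annual downtime implied by a given availability percentage can be computed directly:

```python
HOURS_PER_YEAR = 365 * 24  # 8760 hours

def annual_downtime_hours(availability_pct):
    """Expected downtime per year implied by an availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

# "Three nines" -- the grid-level figure Mills quotes
grid = annual_downtime_hours(99.9)            # about 8.8 hours per year

# "Six nines" -- the level demanded by digital loads
digital = annual_downtime_hours(99.9999) * 3600  # about 31.5 seconds per year
```

Three nines of availability thus still permits nearly nine hours of outage a year, while six nines permits barely half a minute – the “orders of magnitude” gap that Mills argues will reshape the industry.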
As always, what is seen by some as a problem appears to others as an opportunity. The whole area of energy storage technology, for example, seems set to benefit from this growing emphasis on power quality and reliability. Makers of uninterruptible power supply (UPS) systems are of course ready to oblige, although it has been argued that achieving true “computer grade” power reliability will require some radical new thinking.
Caterpillar has recently announced that it is now offering a flywheel-based UPS, with an envisaged range of applications that includes “Internet data centres, telecommunications, broadcasting facilities and critical manufacturing processes.” As described elsewhere in this issue (see pp 21-23), Urenco sees great potential for its advanced flywheel energy storage technology, born of the rigours of enriching uranium with ultra-high-speed centrifuges. Power quality is also one of the intended niches of the Regenesys fuel-cell-based energy storage system being developed by Innogy, with 15 MW showpiece installations planned for Little Barford in the UK and a TVA site in the USA.
The distributed power business, including such technologies as microturbines and fuel cells, expects to be a major beneficiary of the proliferating demand for increased power reliability – the “9’s” business, as some call it. A recent approval by the New York State Department of Public Service is being touted as a landmark by proponents of distributed generation, removing what was seen as a major obstacle and providing a large boost for microturbines and other small-scale energy technologies. The Department has approved the Capstone microturbine as the first three-phase distributed generation device permitted to operate in parallel with utilities under its jurisdiction. This ruling, under New York State’s pioneering distributed generation standard, eliminates the need for expensive, repetitive and time-consuming verification tests for each utility; instead, the approval is based on a review by Underwriters Laboratories of the microturbine system’s “protective functionality”.
Promoters of distributed systems have long argued that their relatively high costs stem not only from the technologies themselves but also from the cost of implementation, including the need to negotiate with each utility and to carry out separate tests for each.
With the New York State ruling, and with the issue of power system reliability rising to the top of the agenda, the climate continues to move in favour of distributed power in general, and microturbines in particular. While the microturbine business can already point to some very satisfied users (see, for example, p 31), less positive lessons from Vattenfall in Sweden should also be taken on board, particularly if the intention is to target the “9’s” market.
Vattenfall ran a microturbine-based CHP demonstration unit in Gothenburg from 1997 until earlier this year. Although availability during the first two seasons of operation was quite high, mechanical problems during autumn 1999 and spring 2000 “dramatically impeded operation and the machine has been down for repair a large portion of this period”, according to Nicolas ter Wisscha of Vattenfall. Other difficulties included a fire in the electrics when the machine attempted to operate at part load.
It turns out that many of the difficulties stemmed from the use of oil-suspended bearings, which would suggest that the air bearings used in many modern microturbines are preferable. Among the other conclusions from the Vattenfall experience was that the microturbine system was overly sensitive to mechanical failures and needed more maintenance than expected. This matters because the assumption of low-cost operations and maintenance is crucial to the economic case for microturbines (and other forms of distributed generation). It is essential that they are robustly designed, with maximum component life, and can achieve the levels of reliability increasingly taken for granted in the digital age.