Bryan Leyland, Industry Consultant
Consultant Bryan Leyland takes a broad view of modern networking and distribution and applies it to the situation in New Zealand. The Electricity Network Association declined to contribute a perspective this year.
EXCEPT IN SPECIAL CIRCUMSTANCES, alternating current (AC) is used to transmit large amounts of power over long distances because transformers offer a simple and economical way of stepping up the low-voltage power produced by generators to the high voltages needed for transmission, and of stepping it back down to distribution voltages at the receiving station. The technology is well proven and has not changed significantly for many years.
Since the 1960s high-voltage direct current (DC) has been used for undersea cables, where AC cables cannot be used because of high charging currents, and, more recently, for long-distance point-to-point transmission using overhead lines. Its advantages are a substantial reduction in the cost of the line – two conductors rather than three, elimination of the skin effect and elimination of the problems associated with reactive power flows. Its one disadvantage is the high cost of the terminal equipment but, as technology advances, the cost is falling steadily.
DC has not yet been used for interconnected networks because, until recently, there were no circuit breakers capable of interrupting high DC currents at high voltages. AC circuit breakers break the current at a natural current zero; their main problem is re-establishing sufficient voltage withstand across the opening contacts to prevent re-strike. DC has no current zero, so DC circuit breakers must force the current to zero themselves.
In 2012, ABB made a technological breakthrough when it developed a hybrid circuit breaker combining a mechanical switch, which normally carries the current, with a solid-state switch in parallel. When the current needs to be interrupted, the mechanical switch opens and commutates the current into the solid-state switch, which then breaks it.
There seems to be little doubt that, in a few years, we will begin to see large-scale interconnected networks using high-voltage (>400kV) DC. In the short term, it is likely that existing high-voltage grids will be reinforced with DC links and, in the longer term, we may even see 220kV+ AC grid systems covering large areas converted to DC. This will substantially increase transmission capacity, eliminate reactive power transfer problems and reduce losses.
About 20 years ago ECNZ seriously considered extending one pole of the DC link up to Auckland using one of the existing 220kV AC lines. There is a strong possibility that the 400kV line will one day operate at something like ±350kV DC rather than 400kV AC.
Distribution technology has barely changed in the past 100 years. Subtransmission systems deliver power to zone substations, where it is transformed to a medium voltage in the 6.6kV to 33kV range and fed to distribution transformers that, in most countries, supply power to about 200 domestic consumers at 230/400V. Countries with systems based on US practice distribute at 110/220V, and a distribution transformer usually supplies four to 10 consumers.
The 230/400V system has stood the test of time and could continue providing power as it always has into the foreseeable future. The 110/220V system already has serious problems because of the high number of transformers and the high electricity demands from air conditioning and, more recently, heating using heat pumps.
However, as more and more domestic appliances use DC internally or incorporate frequency converters, it is possible to contemplate distributing DC at a voltage in the 300V to 600V range. This would supply large loads directly and could be converted within the house to a suitable low DC voltage – perhaps ±24V – to run small appliances, lighting, and electronic equipment. Alternatively, three-phase 660V AC could be run up and down streets, with an individual transformer for each house.
The widespread adoption of electric cars and domestic solar power could force major and expensive changes on the distribution system. It was predicted that there would be one million electric cars in the USA by the end of 2015; the actual figure was 308,000 and the rate of uptake is falling off.
If the number of electric cars exceeds, say, 20 percent of the total, then the (typically) 10kW of extra load they place on the grid during battery charging will become significant. Charging a car that travels 60 kilometres per day would roughly double household electricity consumption, and this would require major and expensive reinforcement of the distribution system. While there is a widespread belief that cars will always be recharged in the early hours of the morning, a significant number of people will choose to charge as soon as they get home – either because they do not care about the extra cost or because they are going out in the evening. In countries with a 110/220V system, a 25 percent uptake of electric cars could force the replacement of huge numbers of distribution transformers and the installation of larger cables. The cost would be high and it would fall on all consumers.
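The load claim above can be checked with a back-of-envelope calculation. The energy-per-kilometre and household consumption figures below are illustrative assumptions, not figures from this article:

```python
# Back-of-envelope check of the extra load from charging an electric car.
# KWH_PER_KM and HOUSEHOLD_KWH_PER_DAY are assumed illustrative values.

KWH_PER_KM = 0.2               # assumed EV consumption, wall to wheels
DAILY_KM = 60                  # daily travel figure from the article
CHARGER_KW = 10.0              # extra charging load per car (article figure)
HOUSEHOLD_KWH_PER_DAY = 12.0   # assumed existing household consumption

ev_kwh_per_day = KWH_PER_KM * DAILY_KM                 # 12 kWh per day
increase = ev_kwh_per_day / HOUSEHOLD_KWH_PER_DAY      # fraction added
charge_hours = ev_kwh_per_day / CHARGER_KW             # time at full power

print(f"EV energy: {ev_kwh_per_day:.1f} kWh/day "
      f"({increase:.0%} increase), about {charge_hours:.1f} h at {CHARGER_KW:.0f} kW")
```

On these assumptions the car adds as much energy per day as the house already uses, and draws it in little over an hour if charged at full rate – which is why the timing of charging matters so much to the distribution system.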
It is often claimed that electric car batteries could be used to support the grid during peak demand periods. But the cost of battery storage is high: a typical battery stores 24kWh, costs $15,000 and lasts for about 1000 charge/discharge cycles. The cost of one charge/discharge cycle is therefore $15 or, assuming about 80 percent of the capacity is usable per cycle, roughly 80 cents/kWh – three times the cost of power from the grid!
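A minimal sketch of that cycle-cost arithmetic, where the depth-of-discharge figure is an assumption needed to reconcile the per-cycle and per-kWh numbers:

```python
# Cost per kWh cycled through an electric car battery, using the
# article's figures. DEPTH_OF_DISCHARGE is an assumed value.

BATTERY_KWH = 24.0         # nameplate capacity (article figure)
BATTERY_COST = 15_000.0    # purchase price in dollars (article figure)
CYCLE_LIFE = 1000          # charge/discharge cycles (article figure)
DEPTH_OF_DISCHARGE = 0.8   # assumed usable fraction per cycle

cost_per_cycle = BATTERY_COST / CYCLE_LIFE                      # $15
cost_per_kwh = cost_per_cycle / (BATTERY_KWH * DEPTH_OF_DISCHARGE)

print(f"${cost_per_cycle:.0f} per cycle, about {cost_per_kwh*100:.0f} c/kWh")
```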
My assessment is that the number of electric cars is not likely to increase as predicted because they are a solution in search of a problem and, around the world, rely on government subsidies.
Likewise, domestic solar power is expensive – about $3300/kW for a capacity factor of around 15 percent. Around the rest of the world, it too is heavily subsidised. In Australia, installations exporting power into the grid between 9am and 3pm can cause an unacceptable rise in the voltage of the distribution system. To mitigate this, Australian regulations require that solar panels shut down if the voltage becomes dangerously high, thereby reducing the benefit of solar power. New Zealand would probably have to do the same if domestic solar increases.
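The $3300/kW and 15 percent figures imply a cost per kWh along the following lines; the panel lifetime and discount rate are assumptions, not figures from this article:

```python
# Rough levelised cost of domestic solar from the article's capital cost
# and capacity factor. LIFETIME_YEARS and DISCOUNT_RATE are assumptions.

CAPEX_PER_KW = 3300.0    # installed cost, $/kW (article figure)
CAPACITY_FACTOR = 0.15   # article figure
HOURS_PER_YEAR = 8760
LIFETIME_YEARS = 20      # assumed panel/inverter life
DISCOUNT_RATE = 0.07     # assumed cost of capital

annual_kwh = CAPACITY_FACTOR * HOURS_PER_YEAR   # energy per kW installed

# Capital recovery factor spreads the up-front cost over the lifetime.
crf = (DISCOUNT_RATE * (1 + DISCOUNT_RATE) ** LIFETIME_YEARS
       / ((1 + DISCOUNT_RATE) ** LIFETIME_YEARS - 1))

lcoe = CAPEX_PER_KW * crf / annual_kwh          # $/kWh

print(f"Levelised cost of energy: about {lcoe*100:.0f} c/kWh")
```

On these assumptions the output costs around 24 cents per kWh before any allowance for maintenance or inverter replacement – well above wholesale power prices, which is the sense in which the article calls it expensive.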
“Smart grids” are often regarded as a wonderful collection of new technologies that will revolutionise the electricity industry. This is not necessarily so.
Grid systems have always been smart using the best available technology at the time. This arose from necessity because something as complex as an integrated power system has to be very smart to isolate faulty sections and minimise the chance of a total blackout.
“Smart grid” usually refers to a system with smart metering and very complex monitoring and control that manages the effect of solar and wind power feeding into distribution systems, to stop them causing serious problems with voltage and power flow.
All smart meters can communicate with a central point to report on power consumption. The main advantage of this is a saving of $10-$20 per year in meter reading. Other advantages are that it gives lines companies better information on the loading of lines and transformers and allows consumers to see what their electricity consumption and power price are at any time. It is widely believed that smart meters can also control consumers’ load. In practice, this does not happen, because new technology is needed in individual appliances so that they can respond when the price is high. There do not appear to be products, or even agreed standards, for this and, it appears, distributors and retailers prefer to sell more power while, to a large extent, consumers do not care.
Smart grids are also promoted as a solution to the problems generated by domestic solar installations. The promoters envisage a system where vast amounts of information are exchanged and complex optimising programs adjust the system to match the current situation. This is a complex and expensive exercise that is needed only because of the existence of expensive and ineffective solar power that we would be better off without. So, once again, it is a solution in search of a problem. Or, putting it another way, a solution to a problem that, in a rational world, would never have existed. (Issues of system security from hacking are now becoming apparent.)
“Demand-side management” is the current buzzword for what used to be called peak load control.
Since the 1930s New Zealand has used insulated storage water heaters to manage peak demand. In the 1950s ripple control – which sends signals over the mains to switch water heaters on and off – became virtually universal. This system was probably the best in the world: it could hold the country’s load virtually constant between 8am and 8pm on peak demand days.
Unfortunately, the regulations associated with the electricity market removed the incentives lines companies had to manage peak demand, and the upper South Island lines companies are now the only ones that act against their own commercial interests and engage in large-scale peak demand control. On a peak demand day they are able to hold the demand constant from 7am to 10.30pm. Without this, the 220kV lines from Christchurch to Nelson would have needed a very expensive upgrade.
If the lines companies in the upper North Island had also acted in the consumers’ interest, the billion-dollar 400kV line might never have been needed.
The sad part about all this is that technologies are available that are very smart and could lead to large savings for consumers. For instance, many existing ripple control relays have frequency-sensitive elements. If they were set to switch off water heaters, refrigeration plant and even air conditioning for a short period when the frequency dropped, they could easily eliminate the need for something like 200MW of spinning reserve. But nobody has explored this option because, it seems, it would be difficult to set up within the structure of our existing electricity market.
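The relay behaviour suggested above amounts to a simple under-frequency trip with hysteresis. A minimal sketch, in which the trip and restore settings are illustrative assumptions rather than actual relay settings:

```python
# Sketch of frequency-triggered load shedding via a ripple-control relay.
# Threshold values are assumed for illustration only.

NOMINAL_HZ = 50.0          # New Zealand system frequency
SHED_THRESHOLD_HZ = 49.7   # assumed trip setting: shed load below this
RESTORE_THRESHOLD_HZ = 49.9  # assumed setting: restore load above this

def relay_state(freq_hz: float, currently_shed: bool) -> bool:
    """Return True if the relay should keep its load switched off."""
    if freq_hz < SHED_THRESHOLD_HZ:
        return True                       # under-frequency: shed the load
    if currently_shed and freq_hz < RESTORE_THRESHOLD_HZ:
        return True                       # hysteresis: wait for recovery
    return False                          # normal operation: load on

# A dip to 49.6 Hz sheds the load; it stays off at 49.8 Hz and is
# restored once the frequency recovers to 49.95 Hz.
shed = relay_state(49.6, False)
shed = relay_state(49.8, shed)
shed = relay_state(49.95, shed)
```

The hysteresis band matters: without it, thousands of relays switching back on at once would drag the frequency straight back down.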
But there are even better options, such as smart water heater thermostats that are connected to the internet and also respond to system frequency. If the frequency dropped, they would reduce the power input to the heater in proportion to the dip. If the frequency rose and the water temperature was normal, they would inject extra energy into the water to absorb the surplus on the system. The cost of managing system frequency would drop dramatically. Such thermostats would also allow load control by the retailer, the consumer, the lines companies and, in emergencies, by the system operator. The scheme would be easy to implement and would save consumers millions of dollars. Yet there appears to be no way of implementing it under the present electricity market structure. Such is progress! Once again, the consumer loses.
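The proportional response described above is essentially a droop control on the heater element. A minimal sketch, assuming illustrative values for the heater rating, droop gain and temperature setpoint:

```python
# Sketch of a frequency-responsive water heater thermostat.
# All ratings, gains and setpoints are assumed illustrative values.

NOMINAL_HZ = 50.0        # New Zealand system frequency
MAX_HEATER_KW = 3.0      # assumed element rating
GAIN_KW_PER_HZ = 10.0    # assumed droop: a 0.3 Hz dip removes full output
NORMAL_TEMP_C = 60.0     # assumed normal tank temperature

def heater_power(freq_hz: float, water_temp_c: float,
                 demand_kw: float = 2.0) -> float:
    """Heater input in kW. demand_kw is what the thermostat would
    draw at nominal frequency to keep the water hot."""
    if freq_hz < NOMINAL_HZ:
        # Under-frequency: reduce input in proportion to the dip.
        reduced = demand_kw - GAIN_KW_PER_HZ * (NOMINAL_HZ - freq_hz)
        return max(0.0, reduced)
    if freq_hz > NOMINAL_HZ and water_temp_c <= NORMAL_TEMP_C:
        # Over-frequency with spare tank capacity: absorb surplus energy.
        boosted = demand_kw + GAIN_KW_PER_HZ * (freq_hz - NOMINAL_HZ)
        return min(MAX_HEATER_KW, boosted)
    return demand_kw
```

With a national fleet of such thermostats, small frequency deviations would be absorbed by water heaters instead of spinning reserve, and the stored hot water means consumers would barely notice.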