Posted by Muzzer on 23/04/2018 10:56:58:
Posted by not done it yet on 23/04/2018 10:12:31:
Posted by Neil Wyatt on 23/04/2018 09:27:16:
Posted by Mike Poole on 22/04/2018 22:24:35:
It is time we moved towards 220V as our standard voltage for real and not fiddling the figures to include 240V. It might take a few seconds longer to boil your 240V kettle, but as old spec equipment works its way out of the system things should settle down.
Energy companies oppose this, as a resistive load like a heater, for example, uses nearly 20% more juice at 240V than at 220V.
9% more at 240V than at 220V. There would be more leccy used to boil a kettle at 220V because it would take longer and thus lose more energy during that extra time.
…
…But few heaters run without a thermostat so what difference does it make? The kettle will end up taking the same energy to boil as good as dammit. The amount of excess water you put in the kettle will have a much bigger effect. And the heater will take the same average power to keep the room warm.
…
Murray
Slight misunderstanding here. It's not about losses in the home: the supplier worries about power loss in his distribution network, not in your kettle. Dropping the volts costs the supplier money.
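On the kettle itself, Murray has it about right: the energy ends up much the same. Here's a rough Python sketch of the sums, purely my own illustration (assuming a fixed-resistance element rated 3kW at 240V, 1.5 litres of water and no heat loss): the element draws roughly 19% more power at 240V, but the energy delivered to the water is the same either way, so only the boiling time changes.

```python
# Back-of-the-envelope check of the kettle argument. Illustrative figures only:
# a fixed-resistance element sized for 3 kW at 240 V, heating 1.5 litres of
# water from 15 C to 100 C with no heat loss.

V_RATED = 240.0                       # volts the element was designed for
P_RATED = 3000.0                      # watts at the rated voltage
R_ELEMENT = V_RATED ** 2 / P_RATED    # element resistance, assumed constant (19.2 ohms)

LITRES = 1.5
JOULES_TO_BOIL = LITRES * 4186 * (100 - 15)   # roughly 4186 J warms 1 litre by 1 degree C

for volts in (220.0, 240.0):
    watts = volts ** 2 / R_ELEMENT      # power drawn: P = V x V / R
    seconds = JOULES_TO_BOIL / watts    # same energy either way, only the time changes
    print(f"{volts:.0f}V: {watts / 1000:.2f}kW, boils in about {seconds:.0f}s")

# Prints roughly:
#   220V: 2.52kW, boils in about 212s
#   240V: 3.00kW, boils in about 178s
```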
The issue lies in the relationship between Watts and Ohm's Law.
- Watts = Volts x Amps
- Amps = Volts ÷ Ohms
- Watts = Amps x Amps x Ohms
Assume your street draws 100kW of power, and the cable run feeding it has a total resistance of, say, 0.02 ohms.
100kW at 220V is about 455 Amps, so the 0.02 ohm cable wastes about 4.1kW (455 x 455 x 0.02) as heat.
100kW at 240V is about 417 Amps, and the same cable wastes about 3.5kW (417 x 417 x 0.02) as heat.
In this example, saving roughly 660W (about 16% of the losses) may not seem much, but there are around 28 million households in the UK. You could reduce the losses by installing fatter electric cables, but copper is expensive; it's cheaper to up the volts.
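If anyone wants to play with the numbers, here's the same sum as a short Python sketch. The 100kW street load and 0.02 ohm feeder resistance are just the illustrative figures from above, not real network data.

```python
# Rough illustration of the worked example: a street drawing 100 kW through a
# feeder with an assumed total resistance of 0.02 ohms. Cable loss is I^2 x R,
# and the current falls as the supply voltage rises.

POWER_DRAWN = 100_000.0   # watts taken by the street
R_FEEDER = 0.02           # ohms, an assumed round figure for the local cables

for volts in (220.0, 240.0):
    amps = POWER_DRAWN / volts       # Amps = Watts / Volts
    loss = amps ** 2 * R_FEEDER      # Watts = Amps x Amps x Ohms
    print(f"{volts:.0f}V: {amps:.0f}A, about {loss:.0f}W lost in the cables")

# Prints roughly:
#   220V: 455A, about 4132W lost in the cables
#   240V: 417A, about 3472W lost in the cables
```

The loss falls as (220/240) squared, roughly a 16% reduction, just by raising the voltage.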
Changing the subject slightly, does anyone know how the various countries standardised on the voltages and frequencies they've adopted?
In the UK, where a mish-mash of early local systems was replaced with a grid, I get the impression that they went for the maximum voltage you could put into a home without killing everybody who touched it! At the time that was 250V. 50Hz was chosen, I think, because it was the best that could be done with the largish number of early alternators then available (limited magnetically rather than mechanically).
In the US I suspect the Edison/Westinghouse marketing fiasco had everybody so terrified of AC that they wimped out big time and standardised on the much safer 110V. By then it was also possible to generate power efficiently at 60Hz and, as they didn't have a lot of elderly infrastructure to accommodate, they went for that and saved dosh, because 60Hz transformers can be made more cheaply.
I theorise 220V split-phase became popular because 110V has quite a few disadvantages other than being wasteful. 220V single phase has come to dominate in the world because local distribution networks can cope with old 110V consumers while offering a good compromise for everybody else: not too dangerous, not too expensive, and not too wasteful.
Dave