what happens to excess electricity produced on the grid


Since it's not possible (unless electricity has properties I'm not aware of) for electric power plants to produce only and EXACTLY the amount of electricity being drawn at any given time, and since not having enough electricity for everyone is a VERY bad thing, I'm assuming the power plants produce enough electricity to meet a predicted average need plus a little extra margin. So, if this understanding is correct, where does that little extra margin go? And what kind of margin are we talking about?

In: Engineering

38 Answers

Anonymous 0 Comments

To add on top of the other answers regarding synchronous generators, inertia, and so on: there are essentially two types of power on the electricity grid, active power and reactive power.

Active power is the ‘usual’ power you already know of and is what is used by your light bulbs and what you pay for.

Reactive power, on the other hand, is a bit weirder. It is basically the power that is "left over" when voltage and current are out of phase in an AC grid. To give an ELI5 analogy: it's kind of like stepping on the gas pedal in a car while the clutch is disengaged.

We need to have some reactive power in the grid, since induction motors use this reactive power to get started.

It’s also used to regulate the voltage and a few other things when doing transmission and distribution of electrical power.

It is exactly the same as active power in terms of voltage and current, but it's just not usable by many electrical devices.
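As a rough numerical illustration (the voltage, current, and phase angle below are made-up example values, not anything from this answer), both kinds of power fall out of the phase angle between voltage and current:

```python
import math

# RMS voltage and current, with the current lagging the voltage by a phase
# angle phi, as is typical of an inductive load such as an induction motor.
V_rms = 230.0           # volts (example value)
I_rms = 10.0            # amps  (example value)
phi = math.radians(30)  # assumed phase angle between voltage and current

P = V_rms * I_rms * math.cos(phi)  # active power (watts) - what you pay for
Q = V_rms * I_rms * math.sin(phi)  # reactive power (volt-amperes reactive)
S = V_rms * I_rms                  # apparent power (volt-amperes)

print(f"Active power:   {P:,.0f} W")
print(f"Reactive power: {Q:,.0f} var")
print(f"Apparent power: {S:,.0f} VA")
```

At a phase angle of zero the reactive power vanishes and everything the generator supplies does useful work; the further out of phase voltage and current get, the more of the apparent power is "left over" as reactive power.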

Anonymous 0 Comments

> I’m assuming the power plants produce enough electricity to meet a predicted average need plus a little extra margin

There are some good answers already, but they didn’t point out that this is where your error lies.
The power plants don't produce any extra power just in case. Combined, the plants on the grid produce exactly as much energy as is needed.
Of course, when the demand changes, the plants will momentarily produce either too much or too little power, but they adapt their output quickly.

Other people have already mentioned inertia, and the change in frequency when you produce too much or too little power. I want to further add that this is somewhat self-balancing:
When you increase the grid frequency, every directly driven motor on the grid spins faster as well (and therefore consumes more power).
The voltage on the grid also rises slightly, making many devices consume a bit more power as well.
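A toy sketch of that self-balancing effect, assuming a simple linear relationship between frequency deviation and motor load (the damping constant D and the load figures are illustrative guesses, not measured grid values):

```python
# Toy model: when grid frequency rises, frequency-sensitive loads such as
# directly driven motors spin faster and consume slightly more power, which
# pushes the frequency back toward nominal.

f_nominal = 50.0       # Hz
load_nominal = 1000.0  # MW of frequency-sensitive load (assumed)
D = 0.02               # assumed: fractional load change per Hz of deviation

def motor_load(f_grid: float) -> float:
    """Power drawn by directly driven motor loads at grid frequency f_grid."""
    return load_nominal * (1 + D * (f_grid - f_nominal))

for f in (49.8, 50.0, 50.2):
    print(f"{f:.1f} Hz -> motor loads draw {motor_load(f):.0f} MW")
```

The extra few megawatts drawn at high frequency (and shed at low frequency) act as a small built-in brake on frequency excursions, on top of the deliberate governor action other answers describe.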

Anonymous 0 Comments

You can bleed off steam before it reaches the turbine, so the turbine spins at exactly the speed you need.

Anonymous 0 Comments

It's true that the classical grid (ignoring any batteries) produces exactly the amount of power that is being used. So imagine a one-gigawatt nuclear power plant, and let's say it's running at 700 MW, and you turn on a 100 W load. Suddenly, instantaneously, that power plant is producing 700,000,100 W, and there'll be an instantaneous tiny increase in the amount of torque being delivered to the generator. In fact, because the generator armature is generally very massive, with very high rotational inertia and very high angular momentum, small loads can be switched in and out without even causing the generator to slow down measurably. Now if you switch in, say, a 100 MW load (you turn on an aluminum smelting furnace), the generator will tend to slow down a little bit, but the generator is designed so that if it slows down even slightly, more steam is applied and the speed and voltage are kept constant. More heat will then need to be applied to the boiler, by increasing nuclear activity or by increasing the amount of oil, coal, or gas being burned.
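To put rough numbers on "not even measurably": the standard swing relation df/dt = ΔP·f₀ / (2·H·S) estimates how fast the frequency moves before the governor reacts. The inertia constant H = 5 s below is an assumed typical value for a large turbine-generator, not a figure from this answer:

```python
# Back-of-the-envelope: how fast does frequency fall when a load is switched
# in, before the governor adds more steam?  df/dt = dP * f0 / (2 * H * S)

f0 = 60.0  # Hz, nominal frequency
S = 1.0e9  # VA, the 1 GW plant rating from the example above
H = 5.0    # s, assumed inertia constant of the turbine-generator

for dP in (100.0, 100.0e6):  # the 100 W load vs. the 100 MW smelter
    rocof = dP * f0 / (2 * H * S)  # rate of change of frequency, Hz/s
    print(f"{dP:>12,.0f} W step -> frequency falls at about {rocof:.1e} Hz/s")
```

A 100 W load moves the frequency by well under a microhertz per second, truly unmeasurable, while the 100 MW smelter pulls it down at roughly 0.6 Hz/s, exactly the kind of deviation the governor is built to catch within seconds.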

Anonymous 0 Comments

A lot of great responses here about how power demands increase or reduce the speed of the turbine generators. I work in nuclear power and I thought I’d offer some facts about how nuclear works that I find interesting.

There is a saying in nuclear that "reactor power follows load". What this means is that if you have a reactor system operating at steady state, an increase or decrease in load demand will cause an automatic increase or decrease in reactor power (within limits, of course).

I'm simplifying here, but the way it works is: say you have a spike in demand. The increased load on the generator translates to more energy being extracted from the steam moving through the turbines. This means that the water on the low-pressure side of the turbine will be cooler. In PWR nuclear plants, this cooler water moves back into the steam generator. Since it's cooler, it extracts more heat energy from the water loop that runs through the reactor (the reactor loop and turbine loop are separate in a PWR, with heat transfer happening at the steam generator). More energy being extracted at the steam generator means that the water returning from the steam generator is cooler as it heads back into the reactor to be heated by the fuel.

PWRs are designed with a negative "moderator temperature coefficient", which means that lower-temperature moderator (water in this case) does a better job of moderating the neutrons, which is better for the chain reaction. So the cooler water returning to the reactor naturally adds reactivity to the core, increasing the power output.

All of the systems that are operating to deliver electricity are balanced, and if you tip one of the systems off balance a bit by changing the load demand, the systems are engineered to rebalance themselves by feedback mechanisms.
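A crude sketch of the moderator-temperature feedback described above, with a made-up coefficient (these numbers are illustrative only, not plant data):

```python
# Illustration of "reactor power follows load" in a PWR: cooler water
# returning from the steam generator adds reactivity because the moderator
# temperature coefficient is negative. Coefficients are made-up examples.

alpha_m = -2.0e-4  # assumed moderator temperature coefficient (dk/k per degC)
T_ref = 300.0      # degC, reference cold-leg temperature at steady state

def added_reactivity(T_cold: float) -> float:
    """Reactivity inserted when the returning (cold-leg) water temperature shifts.

    With a negative coefficient, cooler water -> positive reactivity -> more power.
    """
    return alpha_m * (T_cold - T_ref)

# A demand spike extracts more heat at the steam generator, so the water
# returning to the core comes back a few degrees cooler:
for T_cold in (300.0, 298.0, 296.0):
    print(f"cold leg at {T_cold:.0f} degC -> reactivity {added_reactivity(T_cold):+.1e} dk/k")
```

The cooler the returning water, the more reactivity is inserted and the harder the core works, until the loop temperatures (and power) settle at a new balance matching the load.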

Anonymous 0 Comments

> it's not possible for electric power plants to produce only and EXACTLY the amount of electricity being drawn at any given time

Your assumption is practically incorrect. The grid only works because it is able to produce exactly what is demanded.

Well, maybe not exactly, but it's within, say, 0.1% of the demanded electricity.

> I’m assuming the power plants produce enough electricity to meet a predicted average need plus a little extra margin.

Nope. The total capacity of power plants is sized to handle the maximum possible/reasonable demand you will ever see (roughly 3x the average demand), but the real-time power delivered is exactly what is needed, right in that moment.

The closest thing to what you are thinking of is what dispatchers do. These are technicians who decide when a power plant needs to be turned on, or where to direct or pull power. Their actions, though, are not for real-time regulation or to exactly match power; they are instead to make sure a plant or feeder line isn't being overstressed.

> So, if this understanding is correct, where does that little extra margin go?

Your understanding is not correct; there's no little extra margin, except for essentially rounding errors. The grid is regulated in real time, and that residual margin shows up only as slight variations in grid frequency over time.

I think your misunderstanding is due to not understanding how you can regulate such a giant system. **You are probably thinking you would need to know exactly how much power is being demanded, then use this information to make adjustments. But you actually don't need to know anything about how much power is being demanded.** Instead you can target a particular metric, in this case frequency, and ensure that frequency is locked at all times. This will inherently regulate generated power to equal demanded power.

Grid power is mostly generated with synchronous machines. These are electrical generators fed an external current (the excitation current), which the generator will synchronize to, provided you apply enough torque to its rotor to keep it spinning at sufficient speed. So the grid voltage itself is fed into the generator to synchronize the generator's output frequency to the grid's frequency.

Synchronous machines will remain locked to the input frequency of the excitation current unless you demand more or less power than what the machine is delivering. They do this by spinning their rotors up to a speed where each rotation passes a number of magnetic poles, producing an alternating current whose frequency matches that of the input excitation current. A mechanical speed governor may be employed to ensure the rotor speed stays locked at the ideal speed, preserving the synchronized relationship between the input excitation frequency and the generated frequency.
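The rotor-speed/pole-count relationship described here is the standard synchronous speed formula, N = 120·f / P (rpm, with P the number of poles). A quick sketch:

```python
# Synchronous speed: a rotor with `poles` magnetic poles must spin at
# N = 120 * f / poles rpm to generate an AC frequency of f hertz.

def synchronous_rpm(freq_hz: float, poles: int) -> float:
    return 120.0 * freq_hz / poles

for poles in (2, 4, 36):  # 2-pole steam turbine up to a many-pole hydro unit
    print(f"{poles:>2} poles -> {synchronous_rpm(60, poles):>6.0f} rpm for 60 Hz")
```

That is why big steam turbine-generators on 60 Hz grids spin at 3600 rpm (2 poles) or 1800 rpm (4 poles), while slow hydro machines simply carry many more poles.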

The resulting grid frequency is the product of the average power delivery, load demand and rotational characteristics of all the generators on the grid working together.

The grid is frequency regulated: it targets a fixed frequency, 60 Hz in North America. Targeting frequency inherently forces the electricity generated to almost exactly equal the electricity demanded.

If electricity generated exceeds what is demanded, the frequency will naturally rise. Put simply, if you demand less electricity than your generators produce, the torque load applied by the generator's windings to its rotating shaft drops relative to the torque being applied by the power source to the generator's input shaft. With less torque load, the generator is free to spin faster, and at a faster speed the frequency rises, since the rotor passes over those magnetic poles more quickly.

Conversely, if you demand more electricity than is supplied, the frequency will reduce, since all the generators will slow down.

If the grid overproduces, all of the synchronous machines (generators) hooked up to the grid will speed up. This will cause the grid frequency to rise. Since grid frequency is regulated, a control system will apply counteracting torque (via braking or by reducing the power source's energy output) or lower the excitation current in the synchronous machines, to slow things down and thus lower the generated power to match the demanded power.

Every single generator gets to see the grid’s frequency, and every single generator has a regulation system targeting the same frequency. Through completely independent action, but tied to the same feedback signal (grid frequency), the entire grid can be regulated.
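Here is a toy simulation of that idea: three generators that never talk to each other, each nudging its output in proportion to the one shared signal, the frequency error. All the gains and constants are illustrative assumptions:

```python
# Toy droop control: every generator sees the same grid frequency and
# independently adjusts output in proportion to the frequency error.

f_target = 60.0                # Hz
demand = 900.0                 # MW of load on the grid
base = [300.0, 300.0, 200.0]   # MW setpoints: the grid starts 100 MW short
gains = [200.0, 200.0, 100.0]  # MW per Hz of error (assumed droop gains)
k_freq = 0.001                 # Hz per MW of imbalance per step (toy inertia)
f = 60.0

for _ in range(100):
    error = f_target - f                                    # shared feedback signal
    outputs = [b + g * error for b, g in zip(base, gains)]  # independent actions
    f += k_freq * (sum(outputs) - demand)                   # surplus speeds grid up

print(f"settled at {f:.2f} Hz, total output {sum(outputs):.0f} MW vs demand {demand:.0f} MW")
```

Note that pure proportional (droop) control like this settles slightly off target (59.80 Hz here) while exactly covering the demand; real grids layer a slower secondary control on top that trims setpoints until the frequency returns to exactly 60 Hz.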

**TLDR: The grid is frequency regulated. Frequency regulation forces generated power to almost exactly equal demanded power. You don't need to actually know what power in and out is to regulate! Frequency regulation is constantly occurring in real time; every generator on the grid is synchronized to the grid, and all are targeting the same frequency, 50 Hz or 60 Hz. Every generator will make micro-adjustments to its power output (throw more or less coal into the fire, or release/apply mechanical braking) to keep that frequency on target.**

Anonymous 0 Comments

The standard in North America is to supply 120 V AC to homes within ±10% (108 V to 132 V). Depending on the current draw of the system, that voltage will drop or rise, per the power relation (Power = Voltage × Current). The utilities aim for a power level that will provide enough current at 120 V based on historical averages; if that level is not required, the voltage goes up and soaks up the extra power. If demand exceeds the average, like during a heat wave when all the air conditioners are turned on, the total current draw exceeds the available power and you see the voltage drop below the 108 V AC level, which causes a brownout.

For industries that require very stable power input, or whose machinery is prone to spikes in power consumption, banks of large capacitors are installed to hold additional power in reserve for major swings on the power grid. Google the power triangle if you want to know about reactive power vs. real power. The utilities also have massive banks of capacitors to help stabilize power draw.

So basically, any excess power is factored into the system by having a standard that allows a range of voltages to be delivered to a household. It's up to the appliance manufacturer to cope with that range.
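A quick sketch of why a higher line voltage "soaks up" extra power: for a simple resistive load, the power drawn scales with the square of the voltage (P = V²/R). The 1000 W nominal load below is an arbitrary example:

```python
# For a resistive load, power drawn scales with voltage squared (P = V^2 / R),
# so a higher line voltage draws more power and a sagging voltage delivers less.

R = 14.4  # ohms: sized so the load draws 1000 W at the nominal 120 V (R = V^2 / P)

for v in (108.0, 120.0, 132.0):  # the +/-10% service band
    print(f"{v:5.1f} V -> {v**2 / R:6.0f} W")
```

At the bottom of the band the same appliance draws about 810 W; at the top, about 1210 W.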

Anonymous 0 Comments

The grid produces electricity at a set frequency, 60 Hz or 50 Hz, depending on where you live.

I'm in the US, so it's 60 Hz. Europe uses 50 Hz. Japan has multiple grids, some using 50 Hz and some using 60 Hz.

If more energy is being produced than used, the grid speeds up slightly.

If more energy is being used than produced, the grid slows down.

Think about it like the load on the grid has all the power plants pushing against it. If the power plants are pushing too hard, it goes too fast. If the load is pushing harder than the power plants are pushing back, it goes too slow.

Every power plant individually needs to keep track of the grid frequency and tweak its output accordingly. There's also a lot of planning that goes into which power plants will come online and when, to try to predict the amount of energy that will be needed. The owner of the grid can also offer to pay more money for electricity at certain times, usually when demand is high.