I’m confused why the term watt-hours (Wh) is used. A watt is a joule per second, so a watt-hour is joules per second multiplied by hours. There are two time measurements within the same term.
Why not just use joules for the total amount of energy used and watts for the consistent/average output over a specific time? I assume a watt-hour is how many watts are consistently produced over an hour-long period, but it is often used in unhelpful scenarios. When talking about, say, solar generation, someone could say “my solar array produces 12 kWh every day”. So 12 kWh over 24 hours means your solar array produces 0.5 kW of power, averaged over the entire day. How was watt-hours helpful in any way to describe the solar array’s power output?
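To spell that arithmetic out, here is a quick sketch in Python (the numbers are taken straight from the example above):

    hours_per_day = 24
    daily_energy_kwh = 12                      # "my solar array produces 12 kWh every day"
    average_power_kw = daily_energy_kwh / hours_per_day
    print(average_power_kw)                    # 0.5 kW, averaged over the whole day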
Or when talking about a city’s power output, the reports are measured in gigawatt-hours over the entire year. Why is quantifying a city’s yearly output in terms of an hour-long period helpful?
Now if we compare the two given examples, it becomes even more confusing. If I had a solar array that produces 12 kWh every day, how many solar arrays would I need to power a city that needs 5000 GWh every year? 5000 GWh per year is around 570.78 MW, so if I just used standard watts instead of watt-hours I would have a simple conversion between scenarios while still having the option to say “0.5 kW each day” or “570.78 MW for a year”.
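Writing that comparison out the same way (again just a rough sketch, using the example figures of 12 kWh per day per array, 5000 GWh per year for the city, and a 365-day year):

    hours_per_year = 365 * 24                          # 8760 h
    city_energy_mwh = 5000 * 1000                      # 5000 GWh = 5,000,000 MWh
    average_city_power_mw = city_energy_mwh / hours_per_year
    print(round(average_city_power_mw, 2))             # ~570.78 MW

    array_energy_kwh_per_year = 12 * 365               # 4380 kWh per array per year
    arrays_needed = (city_energy_mwh * 1000) / array_energy_kwh_per_year
    print(round(arrays_needed))                        # ~1,141,553 arrays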
Sorry if this is sounding like a rant post, but I’m really annoyed by this term.
Latest Answers
Solar output is rarely consistent throughout the day; however, in some cases all of the panels’ output goes to the grid and is not used by the consumer.
So they don’t care at all that it made 3 kW at 9 AM and 9 kW at 11 AM, just that over the whole day it put out x kWh, as that is what will be paid for.
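A tiny sketch of what the meter is effectively doing (the hourly readings here are made up purely for illustration):

    # hypothetical power readings (kW) for each hour of one day
    hourly_power_kw = [0, 0, 0, 0, 0, 0, 0.5, 1.5, 3.0, 6.0, 8.0, 9.0,
                       9.5, 9.0, 7.5, 5.0, 3.0, 1.0, 0.2, 0, 0, 0, 0, 0]
    interval_hours = 1
    energy_kwh = sum(p * interval_hours for p in hourly_power_kw)
    print(energy_kwh)   # 63.2 kWh for the day; that total is what gets paid for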