TLDR: what dictates the amount of power consumed by heating/cooling utilities?
We all grew up knowing that touching the thermostat was a no-no, and that nudging it even slightly away from its usual setting would raise the bill significantly. But why is this?
In my head I think of it like this: if a house is 80 degrees F and you turn the AC down to 70, then obviously a lot of power gets used to bring the temperature down to 70 that first time. And it makes sense that if you kept turning the AC off, letting the house drift back up to 80, and then turning it on again, you'd use considerably more power, since you're displacing that big difference every time.
But let’s say the house is at 71 and you set the thermostat to 70. It only has to cool 1 degree. No big deal, right?
Okay, well, if you keep your AC/heat set to a certain temp, then the AC kicks on every time the temperature drifts past the set point. Let’s say, for the sake of conversation, it kicks on after a 1 degree difference.
If that’s the case, then why is it so much harder to keep a house at 65 vs 75 in the summer, or 75 vs 65 in the winter? If the temp only moves 1 degree before the AC kicks on, then what’s the difference?
“Getting to the initial temperature” is a shortcut to getting confused, and there is a way to think about this that avoids it. The rate at which heat leaks into or out of the building depends only on the difference between the indoor and outdoor temperatures. The closer the inside is to the outside temperature, the less heat moves, and the less work the AC or furnace has to do. On a cold day, the energy you save by letting the house cool down and the energy you spend heating it back up roughly cancel out, but the energy you save *while it sits at the cooler temperature* is genuinely saved. The same logic answers the 65-vs-75 question: a house held at 65 in the summer gains heat from outside faster than one held at 75, so the AC has to run a larger fraction of the time, even though each individual cycle only moves 1 degree.
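A rough toy simulation can make this concrete. This is only a sketch, not a real HVAC model: every number below (outdoor temperature, leak rate, AC cooling rate, AC power draw) is an assumption made up for illustration. The only physics it encodes is that heat leaks in at a rate proportional to the indoor/outdoor difference, and the AC cycles with a 1 degree deadband.

```python
# Toy model: heat leaks into the house in proportion to the
# indoor/outdoor temperature difference; a simple thermostat
# with a 1-degree deadband turns the AC on and off.
# All constants are hypothetical, chosen only to illustrate the idea.

OUTDOOR_F = 95.0      # assumed summer outdoor temperature
LEAK_RATE = 0.002     # deg F gained per minute, per deg F of difference (assumed)
AC_COOL_RATE = 0.5    # deg F the AC removes per minute while running (assumed)
AC_POWER_KW = 3.0     # electrical draw while the AC runs (assumed)
DEADBAND_F = 1.0      # AC kicks on 1 degree above the setpoint
MINUTES = 24 * 60     # simulate one day, minute by minute

def energy_for_setpoint(setpoint_f: float) -> float:
    """Return kWh used over one simulated day holding the given setpoint."""
    indoor = setpoint_f
    ac_on = False
    kwh = 0.0
    for _ in range(MINUTES):
        # Heat leaks in faster when the house is colder relative to outside.
        indoor += LEAK_RATE * (OUTDOOR_F - indoor)
        # Thermostat logic: on above setpoint + deadband, off at setpoint.
        if indoor >= setpoint_f + DEADBAND_F:
            ac_on = True
        elif indoor <= setpoint_f:
            ac_on = False
        if ac_on:
            indoor -= AC_COOL_RATE
            kwh += AC_POWER_KW / 60.0  # kW running for one minute -> kWh
    return kwh

for sp in (75.0, 65.0):
    print(f"setpoint {sp:.0f} F: ~{energy_for_setpoint(sp):.1f} kWh/day")
```

With these made-up numbers, the 65 F setpoint uses roughly 1.5 times the energy of the 75 F setpoint, matching the ratio of the temperature differences (30 F vs 20 F from a 95 F day). The deadband size barely matters; what drives the bill is how big a gap the AC has to maintain against the outdoors.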