TLDR: what dictates the amount of power consumed by heating/cooling utilities?
We all grew up knowing that touching the thermostat was a no-no, and that even slightly adjusting it away from its usual temperature would raise the bill significantly. But why is this?
In my head I think of it like this: if a house is 80 degrees F and you turn the AC down to 70, then obviously a lot of power will be used to bring the temperature down to 70 the first time. And it makes sense that if you kept turning the AC off, letting the house drift back up to 80, and then turning it on again, you would use considerably more power, because of the large difference being made up each time.
But let's say the temperature of the house is 71 and you set the AC to 70. It only has to cool 1 degree. No big deal, right?
Okay, well, if you keep your AC/heat set to a certain temperature, then the AC kicks on every time the temperature drifts past the set point. Let's say, for the sake of conversation, it kicks on after a 1 degree difference.
If that's the case, then why is it so much harder to keep a house at 65 vs. 75 in the summer, or at 75 vs. 65 in the winter? If the temperature only moved 1 degree before the AC kicked on, then what's the difference?
Because at 65 degrees the heat is trying to get into your house much faster than at 75 degrees. You're comparing the wrong numbers: it's not 65 and 66 vs. 75 and 76 degrees, it's 65 and 75 vs. the outside temperature that's trying to heat your house up. Heat exchange is driven by a difference of potential; the bigger the temperature difference, the faster heat transfers. At 65 your AC has to fight harder against the hot outside air than at 75, and it will run more often.
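To make that concrete, here is a minimal sketch (not from the original post) of the idea above: heat leaks into the house at a rate proportional to the indoor/outdoor temperature difference, and a thermostat with a 1 degree F swing switches the AC on and off. All the numbers (90 F outside, the leak rate, the AC cooling rate) are made-up illustration values, not measurements.

```python
def simulate(setpoint_f, outside_f=90.0, hours=24.0, dt=1 / 60):
    """Return the fraction of time the AC runs to hold `setpoint_f` (assumed toy model)."""
    leak_rate = 0.2    # degrees F gained per hour, per degree F of indoor/outdoor gap (assumed)
    ac_cooling = 8.0   # degrees F per hour the AC removes while running (assumed)
    temp = setpoint_f
    ac_on = False
    on_minutes = 0.0
    t = 0.0
    while t < hours:
        # Heat leaks in faster when the indoor/outdoor gap is bigger.
        temp += leak_rate * (outside_f - temp) * dt
        if ac_on:
            temp -= ac_cooling * dt
            on_minutes += dt * 60
        # Thermostat with a 1 degree swing: turn on above setpoint + 1, off back at setpoint.
        if temp >= setpoint_f + 1.0:
            ac_on = True
        elif temp <= setpoint_f:
            ac_on = False
        t += dt
    return on_minutes / (hours * 60)

for sp in (75, 65):
    print(f"setpoint {sp} F: AC running about {simulate(sp):.0%} of the time")
```

With these made-up numbers, holding 65 makes the AC run roughly twice as much of the day as holding 75, even though each individual cycle only has to remove that same 1 degree swing, which is the "fights harder against the outside temperature" point in the answer.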