65 degrees vs 75 degrees energy usage


TLDR: what dictates the amount of power consumed by heating/cooling utilities?

We all grew up knowing that touching the thermostat was a no-no, and that even slightly adjusting it away from its usual temperature would raise the bill significantly. But why is this?

In my head I think of it like this: if a house is 80 degrees F and you turn the AC down to 70, obviously a lot of power will be used to get the temperature down to 70 in the first place. And it would make sense that if you kept turning the AC off, letting the house climb back up to 80, and then turning it on again, you would use considerably more power, since you're displacing that large difference each time.

But let's say the house is at 71 and you set the thermostat to 70. It only has to cool 1 degree. No big deal, right?

Okay, well, if you keep your AC/heat set to a certain temperature, then the AC kicks on every time the temperature drifts past the set point. Let's say, for the sake of conversation, it kicks on after a 1 degree difference.
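Something like this toy rule is what I picture the thermostat doing (just a sketch I made up to illustrate the idea; the one-degree deadband and the names aren't from any real thermostat):

```python
def ac_should_run(indoor_f, setpoint_f, currently_on, deadband_f=1.0):
    """Toy cooling deadband: kick on once the room drifts deadband_f above
    the setpoint, keep running until it's back down to the setpoint."""
    if indoor_f >= setpoint_f + deadband_f:
        return True            # too warm: switch the AC on
    if indoor_f <= setpoint_f:
        return False           # back at the setpoint: switch it off
    return currently_on        # in between: keep doing whatever it was doing
```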

If that's the case, then why is it so much harder to keep a house cool at 65 vs 75 in the summer, or warm at 75 vs 65 in the winter? If the temperature only moves 1 degree before the system kicks on, then what's the difference?


5 Answers

Anonymous

Because the thermostat does basically what you describe.

You can heat from 60 to 70 degrees, turn it off, let it cool back down to 60, and repeat. Or you keep the thermostat at 70: it shuts off at 70, lets the house cool down to, say, 68, turns the heating on again, and so on.

It doesn't really matter much whether you heat-cycle 10 times from 68 to 70 or twice from 60 to 70 over the same period: you need about the same amount of energy either way, because what you're really paying for is replacing the heat that leaks out to the colder outside air, and that leakage grows with the gap between the inside and outside temperature. That's why holding 75 costs more than holding 65 in the winter; whether you heat in two long burns or ten short ones matters much less.
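If you want to see that in rough numbers, here's a toy sketch (not anyone's real thermostat; the leak rate, furnace output, and outdoor temperature are all made-up values), where the house loses heat in proportion to the indoor/outdoor gap:

```python
# Toy comparison of the two heating patterns above. All rates are invented:
# the house loses heat in proportion to the indoor/outdoor gap, and the
# furnace adds heat at a fixed rate whenever it is running.

def furnace_hours(low_f, high_f, outdoor_f, hours=24.0, dt=1.0 / 60):
    """Hours the furnace runs while the house bounces between low_f and high_f."""
    k = 0.1          # assumed leak rate: deg F lost per hour, per deg F of gap
    heat_rate = 8.0  # assumed furnace output: deg F of warming per hour while on
    indoor, heating, on_hours = float(high_f), False, 0.0
    for _ in range(int(hours / dt)):
        indoor -= k * (indoor - outdoor_f) * dt   # heat leaking to the outside
        if heating:
            indoor += heat_rate * dt
            on_hours += dt
        if indoor <= low_f:        # drifted to the bottom of the band: turn on
            heating = True
        elif indoor >= high_f:     # back at the top of the band: turn off
            heating = False
    return on_hours

print(furnace_hours(68, 70, outdoor_f=30))  # many short cycles
print(furnace_hours(60, 70, outdoor_f=30))  # a few long cycles; similar total
```

With these made-up numbers the 68-to-70 pattern runs the furnace roughly 12 hours a day and the 60-to-70 pattern a bit less, because the house spends more time at a lower average temperature. Either way, the total run time is set by the indoor/outdoor gap, not by how many times the furnace cycles.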
