65 degree vs 75 degree energy usage


TLDR: what dictates the amount of power consumed by heating/cooling utilities?

We all grew up knowing that touching the thermostat was a no-no, and that even a slight adjustment away from the usual temperature would raise the bill significantly. But why is that?

In my head I think of it like this: if a house is 80 degrees F and you turn the AC down to 70, then obviously a lot of power will be used to get the temperature down to 70. And it makes sense that if you kept turning the AC off, letting the house climb back to 80, and then turning it on again, you would use considerably more power, since a large temperature difference is being displaced each time.

But say the house is at 71 and you set the thermostat to 70. It only has to cool 1 degree. No big deal, right?

Okay, well, if you keep your AC/heat set to a certain temperature, then your AC kicks on every time the temp rises or falls past the set point. Let's say, for the sake of conversation, it kicks on after a 1 degree difference.

If that's the case, then why is it so much harder to keep a house at 65 vs 75 in the summer, or 75 vs 65 in the winter? If the temp only moves 1 degree before the AC kicks on, then what's the difference?


5 Answers

Anonymous 0 Comments

"Getting to the initial temperature" is a shortcut to getting confused. There is a way to approach this problem that avoids it. The heat that enters or leaves the building depends only on the temperature difference between inside and outside. If the building is closer to the outdoor temperature, less heat moves. On a cold day, the energy you save by letting the house cool down and the energy you spend heating it back up roughly cancel out, while the energy saved *while it stays cool* is genuine savings.

Anonymous 0 Comments

Basically, the farther your home's temperature is from the outside, the harder the outside is going to work to close that gap.

This relationship is based on the difference in temperature, and it's basically linear.

So, for example, if your house is at 70 degrees and it's 60 degrees out, that's a 10 degree difference you need to maintain.

But if you raise your house's temperature to 71 degrees, you now have an 11 degree difference and basically have to work 10% harder and use 10% more energy to keep your house at 71.
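A quick sketch of that linear relationship. The conductance number here is invented just for illustration; the real value depends on the house's insulation:

```python
# Steady-state power scales linearly with the indoor/outdoor
# temperature difference (Newton's law of cooling).
UA = 50.0  # made-up whole-house conductance, watts per degree F

def steady_state_power(indoor_f, outdoor_f):
    """Power needed to hold indoor_f when it's outdoor_f outside."""
    return UA * abs(indoor_f - outdoor_f)

p70 = steady_state_power(70, 60)    # 10 degree difference
p71 = steady_state_power(71, 60)    # 11 degree difference
print(p70, p71, (p71 - p70) / p70)  # 500.0 550.0 0.1 -> 10% more power
```

The absolute numbers are fiction, but the 10% jump is not: going from a 10 degree gap to an 11 degree gap is a 10% increase no matter what the conductance is.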

Using your example of "If the temp only moved 1 degree before kicking on then like what's the difference?" — the difference is how long it takes before your AC/heat needs to kick in again.

For example, I live in a cold climate and on *really* cold days my furnace is basically constantly on just fighting to maintain the current temp. But on days that are less cold it might run for an hour or so total throughout the day.

Anonymous 0 Comments

The main thing you have to understand is that heat loss or heat gain is dependent on the temperature difference.

If the house sits at 80 without any heating or cooling and you cool it down to 70, it will slowly rise back up toward 80. The closer the house is to its resting temperature, the slower the heat gain, and thus the less you need to cool.

So if your house has a resting temperature of 80F, then it's going to be more expensive to cool it to 60 than to 70. But if you repeatedly cool it down to 70 and let it heat back up, the average temperature is roughly 75, and it will cost about as much energy as just holding it at 75 (roughly; it's a bit more complicated than that).
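A toy lumped-house simulation (every constant here is invented) makes that "roughly 75" claim concrete — cycling between 70 and the resting temperature costs about as much cooling as holding a point near the midpoint:

```python
# Toy simulation, all constants invented. Compares cycling the AC
# (cool to 70, drift back up near 80) with holding ~75 steadily.
OUTSIDE = 80.0   # resting temperature of the house
K = 0.1          # fraction of the indoor/outdoor gap closed per step
STEPS = 10_000

def run(setpoint_low, trigger_high):
    """Cool back to setpoint_low whenever temp drifts up to trigger_high.
    Returns total degrees of cooling delivered (a proxy for energy)."""
    temp, energy = setpoint_low, 0.0
    for _ in range(STEPS):
        temp += K * (OUTSIDE - temp)       # house drifts toward OUTSIDE
        if temp >= trigger_high:           # thermostat turns the AC on
            energy += temp - setpoint_low  # cool back down in one go
            temp = setpoint_low
    return energy

# Trigger at 79, not 80: the drift only approaches 80 asymptotically.
cycling = run(70.0, 79.0)   # big 70 -> ~79 swings
steady = run(75.0, 75.5)    # hold close to the midpoint
print(cycling, steady)
```

With these made-up numbers the cycling run comes out a bit under 20% cheaper than holding 75 — the "bit more complicated" part: the house spends most of each cycle near the warm end, where it gains heat slowly, so its time-average temperature is actually above the midpoint.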

So why is it harder to keep the house cool in the summer? Because the resting temperature of the house is much higher, which means the house heats up faster, so the AC needs to turn on more often to keep the temperature down.

Anonymous 0 Comments

Because at 65 degrees, heat is trying to get into your house much faster than at 75 degrees. You're comparing the wrong numbers: it's not 65-and-66 vs 75-and-76, it's 65 and 75 vs the outside temperature that's trying to heat your house up. Heat exchange is driven by a difference of potential; the bigger the difference in temperature, the faster heat transfers. At 65 your AC has to fight harder against the high outside temp than at 75, and it will run more often.

Anonymous 0 Comments

Because the thermostat does basically what you describe.

You can heat from 60 to 70 degrees, turn it off, let it cool back down to 60, and repeat. Or you keep the thermostat at 70: it shuts off at 70, lets the house cool down to, say, 68, turns the heating on again, and so on.

It doesn't really matter whether you cycle 10 times from 68 to 70 or twice from 60 to 70 over the same period. You need about the same amount of energy to counter the heat loss to the outside, regardless of whether you heat twice for longer or 10 times for shorter.
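A quick sanity check of that claim with a toy simulation (all constants invented): heat loss per step is proportional to the indoor/outdoor gap, and the furnace reheats the house whenever it drifts down to the low point:

```python
# Toy simulation, made-up constants: heating in a cold snap, comparing
# many shallow thermostat cycles with a few deep ones over the same time.
OUTSIDE = 30.0
K = 0.01    # fraction of the indoor/outdoor gap lost per step
STEPS = 20_000

def run(low, high):
    """Reheat to high whenever the house cools to low.
    Returns total degrees of heating delivered (a proxy for energy)."""
    temp, energy = high, 0.0
    for _ in range(STEPS):
        temp += K * (OUTSIDE - temp)  # house loses heat to outside
        if temp <= low:
            energy += high - temp     # furnace brings it back up
            temp = high
    return energy

shallow = run(68.0, 70.0)   # many small 68 -> 70 cycles
deep = run(60.0, 70.0)      # fewer large 60 -> 70 cycles
print(shallow, deep)
```

With these numbers the two schedules land within roughly 10% of each other, with the deep 60-to-70 cycles slightly cheaper because the house averages a bit cooler — the "energy saved while it is cool" from the first answer.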