How does a computer overheating actively make it perform worse?


I understand the increased degradation of parts over time, but what is the heat doing to make a computer perform worse in real time?


8 Answers

Anonymous 0 Comments

2 things:

* Modern computers are programmed to slow themselves down once they overheat, so they don't reach critical temperatures that would damage them.

* Resistance increases with temperature, and the higher the resistance, the larger the share of energy that is wasted as heat, *meaning the hotter your computer gets, the more heat it generates for the same tasks.* That extra heat makes the throttling in the previous point kick in sooner (a toy simulation of this loop follows the list).
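
Neither effect requires anything exotic; a toy model makes the feedback loop easy to see. The sketch below is plain Python with made-up numbers (the power draw, cooling rate, heat capacity, and the 90 °C throttle point are all assumptions, not figures from the answer above): heat output grows as the chip warms, and the clock drops once a temperature limit is crossed.

```python
# Toy model of the loop described above: hotter chip -> more wasted power ->
# more heat, until the throttle limit forces a slowdown. All numbers invented.

AMBIENT = 25.0         # room temperature, degrees C (assumed)
THROTTLE_AT = 90.0     # temperature where the chip slows itself down (assumed)
BASE_POWER = 50.0      # watts of heat at full speed and ambient temp (assumed)
TEMP_COEFF = 0.004     # extra fraction of power wasted per degree of warming (assumed)
COOLING = 0.8          # watts removed per degree above ambient (assumed)
HEAT_CAPACITY = 200.0  # joules to warm the package by one degree (assumed)


def simulate(seconds: int = 900) -> None:
    temp = AMBIENT
    throttled = False
    for t in range(seconds):
        # Thermal throttling with a little hysteresis: slow down above the
        # limit, return to full speed only after cooling off a few degrees.
        if temp >= THROTTLE_AT:
            throttled = True
        elif temp <= THROTTLE_AT - 5:
            throttled = False
        clock = 0.6 if throttled else 1.0

        # Heat produced scales with clock speed, and climbs further as the
        # chip warms (the "resistance increases with temperature" effect).
        heat_in = BASE_POWER * clock * (1.0 + TEMP_COEFF * (temp - AMBIENT))
        heat_out = COOLING * (temp - AMBIENT)
        temp += (heat_in - heat_out) / HEAT_CAPACITY

        if t % 60 == 0:
            print(f"t={t:3d}s  temp={temp:5.1f}C  clock={clock:.0%}")


simulate()
```

Running it shows the temperature climbing toward the limit and the reported clock dropping to 60% once it gets there, which is the real-time slowdown the question asks about; raising COOLING (better cooling) keeps the clock at 100% for longer.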

Better cooling means the computer takes longer to heat up, so it can do more work at lower temperatures, which in turn means even less heat is generated.
