Why does the performance of a computer processor deteriorate when it overheats?

Anonymous

The CPU has a temperature sensor that tells it how hot it is. If it hits a certain temperature, it lowers its own clock speed (and therefore its performance) to prevent actual overheating. This is called thermal throttling.
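You can actually watch this happen. Here's a minimal sketch, assuming a Linux machine that exposes the usual sysfs files (the exact paths and thermal zone numbers differ between systems, so treat them as placeholders): it just polls the reported temperature and the current clock speed, which drops when the CPU throttles itself.

```python
# Poll CPU temperature and current clock speed via Linux sysfs.
# Paths are the common defaults and may need adjusting on your machine.
import time

TEMP_PATH = "/sys/class/thermal/thermal_zone0/temp"                  # millidegrees C
FREQ_PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"  # kHz

def read_int(path):
    with open(path) as f:
        return int(f.read().strip())

for _ in range(10):
    temp_c = read_int(TEMP_PATH) / 1000    # millidegrees -> degrees C
    freq_mhz = read_int(FREQ_PATH) / 1000  # kHz -> MHz
    print(f"temp: {temp_c:.1f} C   clock: {freq_mhz:.0f} MHz")
    time.sleep(1)
```

Run something heavy (a stress test, a big compile) alongside it and you'll see the clock number fall once the temperature number gets high enough.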

An interesting thing about this is that most processors since 2007 (I think, I'm not entirely sure about the year) don't actually process information as fast as they could. It's possible you've heard of Moore's law, which says (roughly) that the number of transistors we can put on a processor doubles about every two years, which also means that, in theory, computers should get twice as fast every two years.
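The "doubles about every two years" part is just exponential growth, so the back-of-envelope arithmetic is simple (the 100-million starting count here is made up purely for illustration):

```python
# Rough Moore's-law arithmetic: transistor count doubling every two years.
def moores_law(initial_transistors, years):
    return initial_transistors * 2 ** (years / 2)

# e.g. a hypothetical chip with 100 million transistors, 10 years later:
print(f"{moores_law(100e6, 10):,.0f} transistors")  # ~3.2 billion
```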

While the first part still actually holds true, the second part hasn't been true for a while now, or at least not just because we make smaller and smaller transistors. If we ran them as fast as they could go, they would literally melt. The performance increases we've seen in the past decade have come mostly from clever tricks and parallelization.
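To give a rough feel for the parallelization part, here's a toy sketch that runs the same work once on a single core and once spread across all of them (the chunk sizes are arbitrary and the timings will depend entirely on your machine):

```python
# Same total work, done serially on one core vs. split across cores.
import time
from multiprocessing import Pool

def busy_work(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8

    start = time.perf_counter()
    serial = [busy_work(n) for n in chunks]      # one core does everything
    serial_time = time.perf_counter() - start

    start = time.perf_counter()
    with Pool() as pool:                         # chunks spread across cores
        parallel = pool.map(busy_work, chunks)
    parallel_time = time.perf_counter() - start

    assert serial == parallel
    print(f"serial: {serial_time:.2f}s   parallel: {parallel_time:.2f}s")
```

The answers are identical; only how the work is divided up changes, and that's where most modern speedups come from rather than from cranking the clock higher.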

People often talk about how there is a hard limit to computer speeds, since there is a hard lower limit on how small you can make a transistor. But that limit is still far off, and we hit another hard limit, the melting point of silicon, a long time ago.
