What is the relation between a semiconductor’s temperature, resistance, and computer performance?


Because IIRC a semiconductor’s resistance drops when its temperature rises. Why, specifically, do computers (of any kind) then slow down instead of speeding up?


2 Answers

Anonymous

You are remembering it a bit wrong. The resistance of semiconductors increases with temperature up to a certain point, and only drops beyond that. So at normal operating temperatures, an increase in temperature can mean that not enough current gets through the transistors in time, and signals don’t make it all the way through the circuit before the next clock tick, so you end up getting wrong results in the calculations. That usually causes the computer to crash. To prevent this, the processor will do things like sleep for a bit to shed heat, or lower the current and with it the clock frequency, so the current has time to ramp up through all the transistors.
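
To illustrate the idea, here is a minimal sketch of that kind of throttling control loop. The real logic lives in firmware/microcode, not Python; the thermal model, thresholds, and step sizes below are all made-up stand-ins, not real hardware behavior:

```python
# Toy stand-ins for hardware hooks; all numbers are illustrative assumptions.
state = {"temp": 90.0, "mhz": 4000}

def read_temp_c():
    # Fake thermal model: temperature drifts toward a level set by clock speed.
    target = 60 + state["mhz"] / 100        # hotter at higher clocks
    state["temp"] += 0.1 * (target - state["temp"])
    return state["temp"]

def set_clock_mhz(mhz):
    state["mhz"] = mhz

MAX_TEMP_C = 95   # throttle threshold (assumed value)
BASE_MHZ = 4000
MIN_MHZ = 800

for step in range(200):
    temp = read_temp_c()
    if temp > MAX_TEMP_C:
        # Too hot: lower the clock so signals still have time to settle.
        set_clock_mhz(max(MIN_MHZ, state["mhz"] - 200))
    elif temp < MAX_TEMP_C - 10 and state["mhz"] < BASE_MHZ:
        # Cool enough again: ramp the clock back up.
        set_clock_mhz(min(BASE_MHZ, state["mhz"] + 200))
    if step % 20 == 0:
        print(f"step {step:3d}: {temp:5.1f} C at {state['mhz']} MHz")
```

Running it, you can watch the clock oscillate around whatever speed keeps the temperature just under the threshold, which is roughly what sustained throttling looks like on a real chip.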

But in the extreme case where the temperature does climb past the point where resistance starts decreasing, more current flows through the hotter regions of the processor. And even at lower resistance, more current causes more heating. So you end up with a thermal runaway effect that only stops once the semiconductor material melts or the power supply cuts out on overload.
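
To see why that feedback diverges, here is a toy numerical model: a fixed voltage across a region whose resistance falls as it heats, so the dissipated power P = V²/R rises with temperature. Every coefficient below is invented for illustration, not real silicon data:

```python
# Toy thermal-runaway model; all coefficients are illustrative assumptions.
V = 5.0            # fixed voltage across the region (volts)
R0 = 10.0          # resistance at the reference temperature (ohms)
T = 100.0          # starting temperature (C), past the turnover point
T_REF = 100.0
K = 0.02           # fractional resistance drop per degree above T_REF
COOLING = 0.01     # heat shed per degree above ambient, per step
AMBIENT = 25.0
HEAT_CAP = 1.0     # degrees gained per watt, per step

for step in range(50):
    R = R0 * max(0.05, 1 - K * (T - T_REF))   # resistance falls as T rises
    P = V * V / R                              # lower resistance -> more power
    T += HEAT_CAP * P - COOLING * (T - AMBIENT)
    if step % 5 == 0:
        print(f"step {step:2d}: T = {T:7.1f} C, R = {R:5.2f} ohm, P = {P:5.2f} W")
    if T > 1414:   # rough melting point of silicon
        print(f"step {step:2d}: T = {T:.0f} C -- runaway, material fails")
        break
```

Each step, the lower resistance raises the power, which raises the temperature, which lowers the resistance further: the positive feedback loop the answer describes.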

Anonymous

Computers don’t just passively become slower when they get hot. They deliberately slow themselves down so they don’t overheat to the point where the electronics would be damaged.
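
You can often watch this happen yourself. On many Linux systems the kernel exposes the temperature and current clock speed under /sys (the exact zone and CPU paths vary by hardware and driver, so treat this as a sketch and adjust for your machine):

```python
import time

# Common Linux sysfs paths; zone/cpu numbers differ between machines.
TEMP_PATH = "/sys/class/thermal/thermal_zone0/temp"                   # millidegrees C
FREQ_PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"   # kHz

def read_int(path):
    with open(path) as f:
        return int(f.read().strip())

# Run a heavy workload in another terminal and watch the clock
# step down as the temperature climbs toward the throttle point.
for _ in range(10):
    temp_c = read_int(TEMP_PATH) / 1000
    freq_mhz = read_int(FREQ_PATH) / 1000
    print(f"{temp_c:5.1f} C  {freq_mhz:7.1f} MHz")
    time.sleep(1)
```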