Why does the performance of a computer processor deteriorate when it overheats?

Anonymous 0 Comments

If the temperature gets too high, it can more or less melt internal components. However, the processor itself (with help from the operating system) prevents this damage by slowing itself down and, as a last resort, shutting the machine off automatically.

Anonymous 0 Comments

Because of thermal throttling. When it becomes too hot, it will slow itself down to cool off and not get destroyed.

Anonymous 0 Comments

Modern CPUs have temperature sensors that detect when they’re getting too hot. In order to cool down, the CPU can ‘throttle’ (slow down) itself until the temperature stops climbing.
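
To illustrate the idea, here is a minimal toy sketch in Python of what a throttling control loop does conceptually. This is not how any real CPU firmware is written, and read_die_temp / set_clock_mhz are made-up placeholder functions, not real APIs:

    import time

    THROTTLE_TEMP_C = 100   # assumed threshold; real chips use a vendor-set limit ("Tjmax")
    MIN_CLOCK_MHZ = 800
    MAX_CLOCK_MHZ = 4800
    STEP_MHZ = 100

    def throttle_loop(read_die_temp, set_clock_mhz):
        """Toy control loop: step the clock down while hot, back up while cool."""
        clock = MAX_CLOCK_MHZ
        while True:
            temp = read_die_temp()  # placeholder for reading the on-die sensor
            if temp >= THROTTLE_TEMP_C:
                clock = max(MIN_CLOCK_MHZ, clock - STEP_MHZ)   # too hot: slow down
            elif temp < THROTTLE_TEMP_C - 5:
                clock = min(MAX_CLOCK_MHZ, clock + STEP_MHZ)   # cooled off: speed back up
            set_clock_mhz(clock)
            time.sleep(0.01)  # real hardware reacts in microseconds, not milliseconds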

If you find your computer is slowing down a lot, especially when gaming, check whether you can clean the dust from your case and fans, invest in a case with better airflow, or invest in a better cooling solution, such as a better-than-stock fan or an AIO (all-in-one) cooling system.

(An AIO is a self-contained, fan-based liquid cooling system. One part attaches to your CPU in place of a CPU fan; the other screws to the case. Liquid circulates through tubes connecting the two parts, moving heat from the CPU to the large radiator and fans attached to the case, which disperse it.)
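
If you want to check whether your machine is running hot before blaming throttling, here is a minimal sketch using the third-party psutil library. Its temperature readout only works on Linux; on other platforms the function may be missing entirely, which this sketch accounts for:

    import psutil

    # sensors_temperatures() is Linux-only; fall back to an empty dict elsewhere
    readings = getattr(psutil, "sensors_temperatures", lambda: {})()
    for chip, sensors in readings.items():
        for s in sensors:
            limit = f" (high: {s.high} C)" if s.high else ""
            print(f"{chip} {s.label or 'temp'}: {s.current} C{limit}")

If the current readings sit near the "high" values while the machine is idle, dust or a failing fan is a likely culprit.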

Anonymous 0 Comments

Temperature control is needed to keep the chip under a certain threshold. Apart from the obvious cooling, the other way to stop the temperature from increasing is to throttle down the number of operations happening, artificially slowing the processor so it can cool.

Then there’s an emergency shutdown past a certain temperature.

It doesn’t get slow by itself; it’s slowed down by a protection system.

Anonymous 0 Comments

It’s not the heat itself, exactly. As long as the temperature isn’t high enough to cause physical damage to the processor, the processing speed isn’t affected. The problem is that if the CPU is getting hotter and hotter, and the cooling system can’t compensate (e.g. by spinning the fans harder), then at some point the only way to stop it from getting dangerously hot is to reduce how hard the CPU is working. And that’s exactly what happens: the CPU gets slowed down in order to prevent damage from overheating. This is called “throttling”.

Anonymous 0 Comments

It does not deteriorate with heat. The processor is slowed down artificially as the temperature rises in order to keep it from reaching temperatures that could damage the hardware.

Most processor manufacturers consider running at 100-105 °C safe, even for long periods of time. Experiments show that severe hardware degradation only starts at temperatures above 120 °C. It makes sense to design the entire system (cooling, power control, performance control) so that the processor runs as close to 100 °C as possible when operating at peak performance. If your cooling system is too weak, the processor will heat up faster and might need to be slowed down, reducing performance. If your cooling system is too strong, you might be using additional power/weight/size to keep the processor colder for no additional benefit. This is an especially important consideration in laptops, where you want to find the right balance of size, weight, performance, and battery life.
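
As a rough sanity check on the "too weak vs. too strong" tradeoff, you can estimate the steady-state die temperature from the chip’s power draw and the cooler’s thermal resistance (temperature = ambient + power × thermal resistance). The numbers below are made-up examples, not specs for any real part:

    # Back-of-envelope: steady-state temp = ambient + power * thermal resistance
    AMBIENT_C = 25.0
    CPU_POWER_W = 125.0          # assumed sustained package power
    TARGET_C = 100.0             # the ~100 C design point mentioned above

    for name, r_th in [("weak cooler", 0.8), ("adequate cooler", 0.5), ("oversized cooler", 0.2)]:
        temp = AMBIENT_C + CPU_POWER_W * r_th   # r_th in degrees C per watt
        verdict = "will throttle" if temp > TARGET_C else "holds full speed"
        print(f"{name}: {temp:.0f} C -> {verdict}")

The oversized cooler holds the chip far below 100 °C, but per the reasoning above, that extra margin buys no additional performance.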

Anonymous 0 Comments

CPUs of olde didn’t do that: deprived of cooling, they just kept going until the smoke came out and they broke for good.

That’s an expensive sort of performance improvement, so modern CPUs don’t do that anymore: if the temperature goes up, they automatically reduce their clock speeds so as not to generate as much heat and to remain in a safe temperature range. If that doesn’t help and the temperature keeps rising, they eventually shut down entirely, ideally before the smoke is expected to escape.
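
Why does lowering the clock help so much? Dynamic power scales roughly as P = C·V²·f, and because lower clocks also allow lower voltage, a modest slowdown cuts heat disproportionately. A hedged back-of-envelope with made-up constants (the formula is standard, the numbers are illustrative only):

    # Dynamic power scales roughly as P = C * V^2 * f (C = switched capacitance)
    def dynamic_power(c_farads, volts, hertz):
        return c_farads * volts**2 * hertz

    C = 2e-8                                 # assumed effective capacitance (made up)
    full = dynamic_power(C, 1.2, 4.5e9)      # 1.2 V at 4.5 GHz
    slow = dynamic_power(C, 1.0, 3.0e9)      # a lower clock lets the voltage drop too
    print(f"full speed: {full:.0f} W, throttled: {slow:.0f} W "
          f"({slow / full:.0%} of the heat for {3.0 / 4.5:.0%} of the speed)")

In this toy example, giving up a third of the speed cuts the heat output roughly in half, which is why throttling is such an effective protection mechanism.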

Anonymous 0 Comments

The CPU has a thermometer that tells it how hot it is. If it hits a certain temperature, it lowers its own performance to prevent actual overheating.

An interesting thing about this is that most processors since 2007 (I think; I’m not entirely sure about the year) don’t actually process information as fast as they could. It’s possible you’ve heard of Moore’s law, which says (roughly) that the number of transistors we can put on a processor doubles about every two years, which also means that, in theory, computers should get twice as fast every two years.

While the first part actually holds true, the second part hasn’t been true for a while now. Or at least not just because we make smaller and smaller transistors. Because if we had them all run as fast as they could, they would literally melt. The performance increase we’ve been seeing in the past decade has been mostly due to clever tricks and parallelization.

People often talk about how there is a hard limit to computer speeds, since there is a hard lower limit on how small you can make a transistor. But that limit is still far off, and we hit another hard limit, the melting point of silicon, a long time ago.

Anonymous 0 Comments

This is one time where a car analogy makes perfect sense!

Imagine your car’s engine is getting too hot. You can either cut back on the accelerator to let it cool down, or you can ignore it and keep going the same speed until it overheats.

The former results in temporarily slower performance…and that’s about it. The latter results in great performance! Until the engine catches fire and stops working altogether, probably damaging the rest of the car, possibly killing you, and definitely costing thousands of dollars to fix.

Computer processors do the same thing, except the “deceleration” is automatic: manufacturers specifically design them to slow down or switch off entirely until they reach a safe temperature. They assume nobody is stupid enough to think it’s a good idea to run the processor at maximum speed until it melts, damages your other components, and possibly burns your house down, and they don’t want to be sued for it.