Modern PCs have built-in failsafes so that the processor does not cook itself. Once a certain temperature is reached, the CPU slows down in order to generate less heat: the hotter it gets, the slower it runs. If this feature were not in place, a failed fan or a heavy load could damage the processor, which would be expensive to replace.
Heat won't slow it down as long as there is sufficient cooling capacity. If the temperature rises past a certain point, a protection mechanism in the processor kicks in and reduces the clock frequency, and with it the power dissipation, which is proportional to the frequency and to the square of the voltage. The processor can then do fewer calculations per second.
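For reference, that relationship is the standard CMOS dynamic-power approximation; the symbols here are the usual textbook ones, not anything specific to a particular CPU:

$$P_{\text{dyn}} \approx \alpha \, C \, V^2 \, f$$

where $\alpha$ is the switching activity, $C$ the switched capacitance, $V$ the supply voltage, and $f$ the clock frequency. Lowering $f$ (and, where the voltage regulator allows, $V$ along with it) is what brings the heat output down.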
Largely so that more heat isn’t created.
When a computer is going as fast as it can, it continually creates more and more heat. Unfortunately for its health, a computer has important parts made out of thin pieces of metal and plastic, and with too much heat those parts melt or break. So modern computers have a thermometer attached to the parts most likely to overheat (the CPU and GPU), and the hardware and operating system will force the computer to go slower when it gets too hot.
If these failsafes aren't working, the computer is free to go as fast as it likes no matter the heat. But then those important parts start to break, and because the damage only shows up as errors (the computer can't tell which part is failing, just that something is), normal tasks start to take longer, since part of the machine's time is now spent dealing with those errors. If the computer is set up without these circuit breakers and manages to keep running, it will eventually melt or catch fire. Most experts agree that turning an expensive computation machine into a common smoke machine is bad.
The max frequency a CMOS chip can operate at decreases slightly with increasing temperature, but that is not why computers slow down.
The explanation is that there is a maximum temperature the chip can operate at; if it gets too warm, it can be physically damaged by the heat. A temperature sensor measures the temperature, and the chip's power usage is reduced if it gets too warm. Reducing the clock frequency means the chip does less per unit of time, so less heat is produced; that is how power usage is reduced. Doing less per unit of time also means the computer is slower.
It is not that different from when you work hard and get too warm: you have to slow down the work so that you do not overheat.
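As a rough illustration of that sensor-and-throttle loop, here is a minimal Python simulation. The thermal model and every number in it are made up for the sketch; real throttling lives in firmware and microcode, not in a script like this:

```python
# Toy simulation of thermal throttling: heat output scales with clock
# frequency, and a sensor check drops the frequency when a limit is hit.
AMBIENT = 25.0        # ambient temperature in degrees C (made-up value)
T_LIMIT = 90.0        # throttle threshold in degrees C (made-up value)
COOLING = 0.1         # fraction of excess heat removed per tick
HEAT_PER_GHZ = 2.0    # degrees of heating per GHz per tick (made up)

temp = AMBIENT
freq = 4.0            # clock frequency in GHz

for tick in range(50):
    # The chip heats in proportion to its frequency, and the cooler
    # removes heat in proportion to how far above ambient it sits.
    temp += HEAT_PER_GHZ * freq - COOLING * (temp - AMBIENT)

    # Protection mechanism: too hot -> reduce the clock;
    # comfortably below the limit -> ramp back toward the maximum.
    if temp > T_LIMIT:
        freq = max(0.8, freq - 0.4)   # throttle down
    elif temp < T_LIMIT - 10:
        freq = min(4.0, freq + 0.2)   # recover
    print(f"tick {tick:2d}: {freq:.1f} GHz, {temp:5.1f} C")
```

Run it and you can watch the frequency sawtooth up and down around the temperature limit, which is roughly what a throttling CPU does under sustained load with inadequate cooling.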
I believe u/Jason_Peterson has narrowed this down pretty well, but if I may add, this also comes down to how electricity behaves. Heat increases the opposition to current flowing through a computer's circuits; this is known as resistance. When a computer runs, current flows continuously through each part, which generates heat over time, and that heat in turn raises the resistance of the conductors and slows how quickly the transistors can switch. A hotter chip therefore cannot reliably run at as high a frequency. As noted above, this effect is small compared with deliberate throttling, but it is real: electronics naturally generate heat, heat raises resistance, and higher resistance nudges the maximum stable speed down (a quick worked example follows below).
Hope this helped!
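To put a number on that resistance effect: a metal conductor's resistance rises roughly linearly with temperature, R = R0 * (1 + alpha * (T - T0)). A quick sketch using the textbook temperature coefficient for copper (about 0.0039 per degree C); the 1-ohm baseline is just an illustrative value:

```python
# Resistance of a copper trace at a few temperatures, using the
# linear approximation R = R0 * (1 + alpha * (T - T0)).
ALPHA_CU = 0.0039   # temperature coefficient of copper, per degree C
R0, T0 = 1.0, 25.0  # 1 ohm at 25 C (illustrative values)

for temp in (25.0, 60.0, 95.0):
    r = R0 * (1 + ALPHA_CU * (temp - T0))
    print(f"{temp:5.1f} C -> {r:.3f} ohm")
```

That is roughly a 25% rise over 70 degrees. It is a real effect, but as the CMOS answer above points out, it only trims the maximum stable frequency a little; the large slowdowns come from deliberate throttling.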
Higher frequencies on a processor mean less stability.
More power, for a stronger signal, means more stability but also more heat.
So it's not that heat slows down a computer; it's that the frequency must drop to maintain stability when we push less power through the chip to lower the heat. We could instead maintain the power and frequency, which would produce more heat, and that is why better cooling allows for higher frequencies.
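To make that frequency/voltage/heat trade concrete, here is a small sketch comparing a few hypothetical voltage/frequency operating points, using the same P ~ V^2 * f scaling mentioned above. All the numbers are invented for illustration, not taken from any real CPU's DVFS table:

```python
# Relative dynamic power P ~ V^2 * f for a few hypothetical
# voltage/frequency operating points.
points = [
    # (frequency in GHz, core voltage in V) -- invented values
    (4.5, 1.30),
    (3.5, 1.10),
    (2.5, 0.95),
    (1.5, 0.80),
]

base = points[0][1] ** 2 * points[0][0]
for f, v in points:
    rel = (v ** 2 * f) / base
    print(f"{f:.1f} GHz @ {v:.2f} V -> {rel:4.0%} of max power")
```

In this toy table, dropping the clock by 3x cuts power by roughly 8x, because the voltage can come down with the frequency. That superlinear saving is why throttling sheds heat so effectively.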