Why do computers heat up/need cooling

In: Engineering

2 Answers

Anonymous 0 Comments

At the most basic level, computers are electricity moving through tiny little paths in controlled ways. When you move electricity through these tiny pathways, the transfer isn't perfect, so a small amount of the energy is given off as waste heat.

(Think of an old incandescent lightbulb as a good example: hot to the touch.)

If you let these parts get too hot, the paths melt and deform. Computer parts are so small that if these pathways deform even the slightest bit, they touch other metal parts and cause a short (electricity moving from one circuit to another where it wasn't expected). In some cases the entire part starts to deform from the heat, bends, and disconnects itself from its socket.
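The "imperfect transfer" above is Joule heating: current pushed through a resistance dissipates power as heat, P = I²R. Here's a minimal sketch of that arithmetic; the current and resistance figures are made-up illustrative numbers, not specs for any real chip.

```python
def joule_heating_watts(current_amps: float, resistance_ohms: float) -> float:
    """Power dissipated as heat by a current flowing through a resistance
    (Joule's first law: P = I^2 * R)."""
    return current_amps ** 2 * resistance_ohms

# Hypothetical example: ~100 A delivered to a CPU through an effective
# resistance of ~0.01 ohm turns into waste heat:
print(joule_heating_watts(100, 0.01))  # -> 100.0 watts
```

Even a tiny resistance matters once currents get large, which is why the hottest parts of a computer are the ones doing the most work.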

Anonymous 0 Comments

The bits inside the computer that do the, well, computing? They depend on electrical current moving around in conductive (but not perfectly conductive) materials. Moving a current through some resistance generates heat. It might be minimal for any given computational operation, but even small computers perform *loads* of operations over time. Left alone, that heat simply moves into nearby computer parts and the surrounding air, so without some means of carrying it elsewhere, the cumulative buildup in a computer case or in specific components can become great enough to damage them.

So we devise ways of moving the hot air around, giving heat more efficient pathways away from the originating parts, etc.
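The buildup-versus-cooling balance described above can be sketched with a toy simulation: a component puts out steady heat, and cooling removes heat in proportion to how much hotter it is than the surrounding air (a simple Newton's-law-of-cooling model). Every number here is an illustrative assumption, not real hardware data.

```python
def simulate_temp_c(power_w: float, cooling_w_per_deg: float,
                    heat_capacity_j_per_deg: float,
                    ambient_c: float = 25.0, seconds: int = 600) -> float:
    """Crude 1-second-step simulation of a component's temperature:
    constant heat in, cooling proportional to (temp - ambient)."""
    temp = ambient_c
    for _ in range(seconds):
        heat_in = power_w                                  # joules added this second
        heat_out = cooling_w_per_deg * (temp - ambient_c)  # joules removed this second
        temp += (heat_in - heat_out) / heat_capacity_j_per_deg
    return temp

# Same hypothetical 50 W part, weak vs. stronger cooling:
weak = simulate_temp_c(50, cooling_w_per_deg=0.5, heat_capacity_j_per_deg=100)
strong = simulate_temp_c(50, cooling_w_per_deg=2.0, heat_capacity_j_per_deg=100)
print(round(weak), round(strong))
```

The part settles wherever heat in equals heat out, so better cooling (a bigger heatsink, faster airflow) directly lowers the steady temperature rather than just delaying the heat-up.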