How does a CPU convert virtually 100% of its energy to heat when it uses energy to do calculations?

I’m confused. I did some research online and learned that CPUs can essentially double as a hotplate, because 99-100% of the electricity they consume is turned into heat. How? Doesn’t the CPU use energy to make calculations and render things? I’m really confused.

38 Answers

Anonymous 0 Comments

A CPU is a bunch of switches. When a switch is open, only a tiny current leaks through, so the instantaneous power (the product of instantaneous current and voltage) is low. Likewise, when a switch is closed, the voltage across it is small, so again the instantaneous power is low. The fun part is the switching transition: it takes a finite amount of time, during which both current and voltage are significant and the instantaneous power spikes.

A CPU has a bazillion of these switches, each toggling a huge number of times per second. That makes the CPU toasty while it does its job flipping bits, and it’s also why overclocking makes it hotter.
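
To put rough numbers on the “bazillion switches” picture: each transition charges or discharges a little capacitance, and the classic estimate for the resulting dynamic power is P ≈ α·C·V²·f. Here is a minimal Python sketch, where the activity factor, switched capacitance, voltage, and clock are all illustrative assumptions rather than the specs of any real chip:

```python
# Back-of-the-envelope dynamic (switching) power of a CMOS chip:
#   P_dyn ≈ alpha * C * V^2 * f
# All values below are illustrative assumptions, not real CPU specs.

alpha = 0.1      # activity factor: fraction of gates toggling each cycle (assumed)
C = 100e-9       # total switched capacitance in farads (~100 nF, assumed)
V = 1.0          # supply voltage in volts (assumed)
f = 3.0e9        # clock frequency in hertz (3 GHz, assumed)

P_dyn = alpha * C * V**2 * f
print(f"Dynamic switching power: {P_dyn:.0f} W")        # ~30 W with these numbers

# Overclocking raises f (and often V, which enters squared); here we just raise f:
print(f"Same chip at 4 GHz: {alpha * C * V**2 * 4.0e9:.0f} W")   # ~40 W
```

Essentially all of that power ends up as heat in the silicon and has to be carried away by the cooler.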

Anonymous 0 Comments

Energy is a conserved quantity: it is not created or lost, only transformed. Take an electrical appliance that consumes electricity and ask what forms the energy can come out as. Heat, of course. Light, mechanical motion, charged objects, induced magnetic fields. Calculations are not a form of energy, so they don’t count. Heat is really all that the electricity becomes in a CPU. And ultimately all the other forms of energy become heat too. For example, a fan moves air, so you might think mechanical motion is somehow different, but it isn’t: sooner or later that moving air stops, and all of its kinetic energy has been turned into heat.

Anonymous 0 Comments

Everything turns into heat in the end. Energy cannot be created nor destroyed. It can only be converted.

What is a calculation in a CPU? It’s just electricity moving around in wires. If your CPU had no resistance, you could do calculations at almost no energy cost. The reason your CPU “consumes” electricity is that its wires and transistors have resistance, meaning they heat up when electricity passes through them.
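
As a concrete illustration of “wires with resistance heat up”: in a purely resistive path, every joule of electrical energy delivered becomes a joule of heat, at a rate P = I²·R. A tiny sketch where the current, resistance, and duration are made-up values for illustration only:

```python
# Joule heating: in a resistive path, all delivered electrical energy
# becomes heat, at a rate P = I^2 * R.
# The numbers below are made up for illustration only.

I = 0.5      # current through the path, amperes (assumed)
R = 4.0      # resistance of the path, ohms (assumed)
t = 10.0     # duration, seconds (assumed)

P = I**2 * R     # power dissipated as heat, watts
E = P * t        # total heat released over t seconds, joules

print(f"Power dissipated: {P:.1f} W; heat released in {t:.0f} s: {E:.0f} J")
```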

Anonymous 0 Comments

These aren’t really ELI5 answers.

Imagine cars on one of those super highways in China with 20 lanes. Now imagine all the cars are in a traffic jam and can go 0-60 in 1 second.

Now imagine the road is split into 1-mile segments, alternating between segments full of cars and empty segments.

Let’s say that once the segment in front of a group of cars is empty, all the cars in that group accelerate forward at full speed until they reach the next segment of cars clearing out, and regroup there.

That’s going to be a lot of burnt rubber (tires producing a ton of heat getting up to speed from a near-standstill), and when they have to slow down there’s a ton of heat from all the brakes keeping the cars from hitting the ones in front. Plus the heat from the engines.

Now scale that down to about the width of a hair.

That’s a lot of heat!

Anonymous 0 Comments

Where else would the energy go?

Energy can’t be created or destroyed. Or consumed. It can just be converted.

It could go out as radio waves, but then your CPU would effectively be a 100 W radio transmitter, creating huge electromagnetic interference.

Turning energy into matter is possible, but much harder to do.

So the only thing it can become is heat.

If you heat your house with electricity, then you might as well heat it with servers instead of radiators. Basically everything is 100% efficient at turning electricity into heat, because heat is pretty much the definition of waste.

Anonymous 0 Comments

For the little transistors, “being a 1 or a 0” doesn’t cost energy; “becoming a 1 or a 0 after having been a 0 or a 1” does. Think of it this way: the 1 and 0 states are like a ball lying on one side or the other of a hill. It is perfectly content to lie there. To turn a 0 into a 1, you need to push the ball up the hill, and when it rolls down the other side it has to stop and stay in its new place. Its energy is now the same as before, so all the energy you put in must have become heat through friction with the ground as the ball was brought to a stop. This is not exactly how a 1 and a 0 work in a CPU, but it shows that you can change things around while all the energy you put in just becomes heat.
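
One way to make the ball-and-hill picture concrete is to model a bit flip as charging or discharging a tiny gate capacitance: the energy moved per flip, roughly ½·C·V², ends up as heat. A rough sketch in which the capacitance, voltage, and aggregate flip rate are all assumed for illustration:

```python
# Rough energy accounting for bit flips modeled as charging/discharging
# a tiny gate capacitance.  Each flip moves roughly E = 1/2 * C * V^2 of
# energy, which ends up as heat in the transistor and its wiring.
# All numbers are illustrative assumptions, not measured values.

C_gate = 1e-15    # gate capacitance, farads (~1 fF, assumed)
V = 1.0           # supply voltage, volts (assumed)

E_flip = 0.5 * C_gate * V**2        # joules of heat per bit flip
flips_per_second = 1e17             # assumed aggregate toggle rate across the chip

P_heat = E_flip * flips_per_second  # watts of heat from the flips alone
print(f"Heat per flip: {E_flip:.1e} J; total: {P_heat:.0f} W")
```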

Anonymous 0 Comments

Energy can’t really be “used up”; it can only be converted into a different, usually less useful, form. In a CPU, electrical energy is converted into waste heat.

Anonymous 0 Comments

I asked a similar question in r/askscience a while ago; you can read the replies here: https://www.reddit.com/r/askscience/comments/rbazbz/how_much_of_the_electricity_that_a_computer_uses/

You’re right: there is a very, very small amount of energy tied to the computation of the information itself, given by [Landauer’s principle](https://en.wikipedia.org/wiki/Landauer%27s_principle). I’ll use the post from u/bencbartlett to explain it further:

> You’re correct, there is a fundamental energy cost called the Landauer limit associated with (irreversibly) computing one bit of information. The energy cost is due to the change in entropy associated with erasing one bit. The energy cost (k_B T log 2) scales linearly with the ambient temperature, and at room temperature is about 3×10⁻²¹ J.

> As a concrete example, the Core i7 processor in my MacBook Pro runs at 3 GHz with 16 double precision FLOPS per cycle and computes about 2×10¹³ bits per second, which is about 60 nW at room temperature. The processor uses 37 W, so at most about 0.0000002% of the energy is used to compute the information.

> Other than energy converted to light / radio / sound / airflow, all of the other energy of a computer is transferred into heat, and even these categories eventually end up as heat anyway. Your computer is effectively exactly as efficient at converting energy into heat as a standard furnace.
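
For reference, here is the quoted arithmetic redone as a short script. The Boltzmann constant is exact; the room temperature, the ~2×10¹³ bits-per-second rate, and the 37 W package power are taken from the quoted post (its “log 2” is the natural logarithm):

```python
import math

# Landauer limit: the minimum energy to irreversibly erase one bit is
#   E = k_B * T * ln(2)
# (the quoted post writes "log 2", meaning the natural log).
# The bit rate and package power below are the quoted post's estimates.

k_B = 1.380649e-23     # Boltzmann constant, J/K (exact)
T = 300.0              # room temperature, K (assumed)

E_bit = k_B * T * math.log(2)     # ≈ 2.9e-21 J per erased bit
bits_per_second = 2e13            # quoted estimate for the 3 GHz Core i7
P_landauer = E_bit * bits_per_second

P_cpu = 37.0                      # quoted package power, W
fraction = P_landauer / P_cpu

print(f"Landauer energy per bit: {E_bit:.2e} J")
print(f"Minimum 'computation' power: {P_landauer * 1e9:.0f} nW")    # ~60 nW
print(f"Share of the 37 W budget: {fraction * 100:.7f} %")          # ~0.0000002 %
```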