How does a CPU convert virtually 100% of its energy to heat when it uses energy to do calculations?

I’m confused. I did some research online and learned that CPUs can essentially double as a hotplate, because 99-100% of the electricity consumed is turned into heat. How? Doesn’t the CPU use energy to make calculations and render things? I’m really confused.

38 Answers

Anonymous

I had asked a similar question in r/askscience a while ago; you can read the replies here: https://www.reddit.com/r/askscience/comments/rbazbz/how_much_of_the_electricity_that_a_computer_uses/

You’re right that there is a very, very small amount of energy fundamentally required by the computation itself, governed by [Landauer’s principle](https://en.wikipedia.org/wiki/Landauer%27s_principle). I’ll quote the post from u/bencbartlett to explain it further (a quick back-of-the-envelope sketch of the numbers follows after the quote):

> You’re correct, there is a fundamental energy cost called the Landauer limit associated with (irreversibly) computing one bit of information. The energy cost is due to the change in entropy associated with erasing one bit. The energy cost (k_B·T·ln 2) scales linearly with the ambient temperature, and at room temperature is about 3×10⁻²¹ J.

> As a concrete example, the Core i7 processor in my MacBook Pro runs at 3 GHz with 16 double-precision FLOPs per cycle and computes about 2×10¹³ bits per second, which corresponds to a Landauer cost of about 60 nW at room temperature. The processor uses 37 W, so at most about 0.0000002% of the energy is used to compute the information.

> Other than the energy converted to light / radio / sound / airflow, all of the energy a computer uses is turned into heat, and even those categories eventually end up as heat anyway. Your computer is effectively just as efficient at converting energy into heat as a standard furnace.
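To make the arithmetic in that quote concrete, here is a minimal back-of-the-envelope sketch in Python. It only uses the figures quoted above (roughly 2×10¹³ bits per second and a 37 W CPU) plus the standard Boltzmann constant; the variable names are my own and the CPU figures are the quoted poster’s estimates, not measurements.

```python
import math

# Back-of-the-envelope check of the numbers in the quote above.
# Constants are standard physics values; the CPU figures are the
# quoted poster's estimates for a 3 GHz Core i7.

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # approx. room temperature, K

# Landauer limit: minimum energy to irreversibly erase one bit
landauer_energy_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J

bits_per_second = 2e13      # figure quoted for the Core i7 example
cpu_power_watts = 37.0      # figure quoted for the same CPU

# Minimum power that the computation itself fundamentally requires
landauer_power = landauer_energy_per_bit * bits_per_second   # ~60 nW

fraction = landauer_power / cpu_power_watts
print(f"Landauer limit per bit: {landauer_energy_per_bit:.2e} J")
print(f"Minimum 'computation' power: {landauer_power * 1e9:.0f} nW")
print(f"Fraction of the 37 W budget: {fraction:.2e} ({fraction * 100:.7f} %)")
```

The result is on the order of tens of nanowatts against a 37 W budget, i.e. a few parts per billion, so essentially everything the CPU draws from the wall ends up as heat.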
