What is the difference between the CPU’s clock rate and the RAM’s clock rate?


Hello.

I know that the clock determines how often a CPU can work on an instruction. Basically, for each tick, the CPU will either fetch an instruction, decode it, or execute it, correct? The clock is basically responsible for synchronizing this entire process. The number of clock cycles per second is measured in hertz and indicates a CPU's performance, for example, 3 GHz.

However, I’ve just learned that RAM also has a clock rate, for example, 3000 MHz. What is the difference between the clock rate for the CPU and the clock rate for the RAM? Doesn’t the CPU’s clock rate determine how often a CPU can work on an instruction (fetch-decode-execute)? And fetching is basically loading the data from RAM, so where does the RAM’s clock rate come into play, or what exactly does it do? If the processor is constantly fetching instructions from the RAM, why isn’t the RAM’s clock rate the same as the CPU’s?

Thank you!


3 Answers

Anonymous

Adding on to the other good comments from u/KillerOfSouls665 and u/oneplusetoipi: when the CPU fetches an instruction from memory and loads it into a register (the storage it actually does computation on), it’s usually fetching from cache memory. Cache is specialized high-speed memory on the same chip as the CPU, and it *does* run at the same clock speed as the CPU.

Other CPU circuitry moves data back and forth between the RAM and the cache as needed. The cache is very fast (but small), while the RAM is (relatively) slower but much, much larger. This division lets each component do what it does best.
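If you want to see that speed gap yourself, here’s a minimal sketch in C (my own illustration, not from this answer) that pointer-chases through a small array that fits in cache and a large array that spills out to RAM, timing the nanoseconds per access. The array sizes, step count, and the POSIX `clock_gettime()` call are assumptions chosen for a typical Linux/macOS machine; tune them for yours.

```c
/*
 * Cache-vs-RAM latency demo (a sketch under assumed typical sizes).
 * Build with something like: cc -O2 demo.c && ./a.out
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Turn buf into one big random cycle (Sattolo's algorithm), then chase
 * the chain. Each load depends on the previous one, so the CPU can't
 * overlap them, and the random order defeats the hardware prefetcher. */
static double ns_per_access(size_t *buf, size_t len, size_t steps) {
    for (size_t i = 0; i < len; i++)
        buf[i] = i;
    for (size_t i = len - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;           /* j in [0, i) */
        size_t tmp = buf[i]; buf[i] = buf[j]; buf[j] = tmp;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    size_t idx = 0;
    for (size_t s = 0; s < steps; s++)
        idx = buf[idx];                          /* serialized dependent loads */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    if (idx == (size_t)-1)                       /* keep idx live so the loop */
        puts("unreachable");                     /* isn't optimized away      */

    double ns = (double)(t1.tv_sec - t0.tv_sec) * 1e9
              + (double)(t1.tv_nsec - t0.tv_nsec);
    return ns / (double)steps;
}

int main(void) {
    const size_t small = 16 * 1024 / sizeof(size_t);            /* ~16 KiB: fits in L1  */
    const size_t large = (size_t)256 * 1024 * 1024 / sizeof(size_t); /* ~256 MiB: mostly RAM */
    const size_t steps = 20 * 1000 * 1000;

    size_t *a = malloc(small * sizeof *a);
    size_t *b = malloc(large * sizeof *b);
    if (!a || !b) return 1;

    printf("small array (cache): %6.2f ns/access\n", ns_per_access(a, small, steps));
    printf("large array (RAM)  : %6.2f ns/access\n", ns_per_access(b, large, steps));

    free(a);
    free(b);
    return 0;
}
```

On a typical desktop you’d expect the small array to come in at a few nanoseconds per access and the large one at tens of times that, which is exactly the cache-fast / RAM-slow split described above (your numbers will vary).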

Designing RAM to run at CPU clock speed would be ludicrously expensive and technically impossible in a lot of cases due to the distances involved. Running the CPU at RAM speed would unnecessarily slow the CPU. Cache memory buffers between the two to allow each to run at its most suitable speed.
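For a rough sense of the gap (assumed, typical figures): at 3 GHz one clock cycle lasts 1 / (3 × 10⁹ Hz) ≈ 0.33 ns, while a trip all the way to main memory commonly costs on the order of 100 ns. A CPU that had to wait on RAM for every fetch would therefore stall for hundreds of cycles each time; the cache is what hides most of that gap.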
