What is the difference between the CPU’s clock rate and the RAM’s clock rate?

Hello.

I know that the clock determines how often a CPU can work on an instruction. Basically, for each tick, the CPU will either fetch an instruction, decode it, or execute it, correct? It’s responsible for synchronizing this entire process. The number of clock cycles per second is measured in hertz and indicates a CPU’s performance, for example, 3 GHz.
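(To illustrate what I mean, here is a toy sketch in Python. Real CPUs pipeline and overlap these stages, so this is only meant as a mental model of "the clock paces the work", not how hardware actually behaves.)

```python
# Toy model of the fetch-decode-execute cycle, one stage per clock tick.
program = ["LOAD", "ADD", "STORE"]      # pretend instruction memory
stages = ["fetch", "decode", "execute"]

tick = 0
for address, instruction in enumerate(program):
    for stage in stages:
        tick += 1
        print(f"tick {tick}: {stage} {instruction} (address {address})")
```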

However, I’ve just learned that RAM also has a clock rate, for example, 3000 MHz. What is the difference between the clock rate for the CPU and the clock rate for the RAM? Doesn’t the CPU’s clock rate determine how often a CPU can work on an instruction (fetch-decode-execute)? And fetching is basically loading data from RAM, so where does the RAM’s clock rate come into play, or what exactly does it do? If the processor is constantly fetching instructions from the RAM, why isn’t the RAM’s clock rate the same as the CPU’s?

Thank you!

3 Answers

Anonymous

RAM has to operate on a clock too, as all synchronous chips do. A RAM stick’s timings tell you how many of its own cycles it takes to perform an action. For my 3200 MHz RAM the timings are 16-18-18-36. The first number (the CAS latency) is the most important: it says how many memory cycles pass between requesting data and the data being available.
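To make that concrete, here is a rough back-of-the-envelope conversion of those cycles into time. It assumes, as is usual for DDR4, that the advertised 3200 MHz is really 3200 megatransfers per second on a roughly 1600 MHz clock, and it ignores all the other timings:

```python
# Rough arithmetic: what CL16 on "3200 MHz" DDR4 means in real time.
# DDR RAM transfers twice per clock, so the memory clock is ~1600 MHz.
transfers_per_second = 3200e6
memory_clock_hz = transfers_per_second / 2       # double data rate
cas_latency_cycles = 16

cas_latency_s = cas_latency_cycles / memory_clock_hz
print(f"CAS latency: {cas_latency_s * 1e9:.0f} ns")            # ~10 ns

cpu_clock_hz = 3e9                               # the 3 GHz example CPU
print(f"That is ~{cas_latency_s * cpu_clock_hz:.0f} CPU cycles of waiting")
```

So even ignoring everything else, a 3 GHz CPU waits on the order of 30 of its own cycles for that one access, which is part of why the two clocks are specified separately.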

RAM operates on a different clock from the CPU because it would be impractical to keep the two in step over the physical distance between them. At these speeds, light travels only about 10 centimetres per clock cycle (at 3 GHz), so you really have to worry about the time it takes signals to travel.
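To put a number on that, a quick sketch (the frequencies are just the examples used in this thread, and electrical signals in motherboard traces move at only a fraction of light speed, so the practical distances are even shorter):

```python
# How far light travels in one clock cycle: distance = c / f.
c = 3.0e8                                        # speed of light, m/s

for label, f_hz in [("3 GHz CPU clock", 3e9), ("1600 MHz DDR4 clock", 1.6e9)]:
    cm_per_cycle = c / f_hz * 100
    print(f"{label}: ~{cm_per_cycle:.0f} cm per cycle")
```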
