Why does emulation require X amount of CPU for accuracy?


This is something I am having trouble wrapping my head around. Take PCem, for example. It can emulate up to a Pentium II, but a Pentium III is nearly impossible due to current hardware constraints. Yet a Pentium III runs at 433 MHz (if I remember correctly), while modern CPUs are well into the 5 GHz range. Even so, to accurately emulate that 433 MHz chip you need X amount of host CPU.

Why is that the case, if the CPU you’re using to perform the emulation is vastly more powerful?

I read it’s the same even for the Super Nintendo: its CPU ran at roughly 3.6 MHz, yet for accurate emulation you’d need around 3 GHz (which is common today, but wind back a few years and it would be the same question).

Hopefully that makes sense; I am still trying to understand emulation on a deeper level. I’d be happy to have links to any docs that answer this question as well.


10 Answers

Anonymous

Emulation is hardware (your computer) running software (the emulator) pretending to be hardware (game console or whatever), which is running more software (the game or program you’re using emulation to run).

Even though the program you’re running in the emulator might only be executing a single instruction, different hardware may require different instructions, in a completely different order, to achieve the same result. Any translation between the real hardware and the emulated hardware therefore requires far more processing than just running that single instruction. And even then it runs much slower, because the translation happens in software rather than on native hardware.
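To make that concrete, here is a minimal sketch of the classic interpreter approach, in C. Everything in it (ToyCpu, toy_step, the opcodes) is hypothetical, invented for illustration, and not taken from PCem or any real emulator. The point is that a single guest instruction costs the host a fetch, a decode, the operation itself, flag bookkeeping, and a cycle-count update:

#include <stdint.h>
#include <stdio.h>

/* A toy emulated CPU. One guest instruction costs the host a fetch,
 * a decode (the switch), the operation, flag updates, and a
 * program-counter update -- easily dozens of host instructions. */
typedef struct {
    uint8_t  mem[65536]; /* guest memory */
    uint16_t pc;         /* guest program counter */
    uint8_t  a;          /* guest accumulator */
    uint8_t  zero_flag;  /* guest status flag */
    uint64_t cycles;     /* guest cycles elapsed, for timing accuracy */
} ToyCpu;

void toy_step(ToyCpu *cpu) {
    uint8_t opcode = cpu->mem[cpu->pc++];   /* fetch */
    switch (opcode) {                        /* decode */
    case 0x01:                               /* LDA #imm: load accumulator */
        cpu->a = cpu->mem[cpu->pc++];
        cpu->zero_flag = (cpu->a == 0);
        cpu->cycles += 2;                    /* guest cost: 2 cycles */
        break;
    case 0x02:                               /* ADD #imm: add to accumulator */
        cpu->a += cpu->mem[cpu->pc++];
        cpu->zero_flag = (cpu->a == 0);
        cpu->cycles += 2;
        break;
    default:                                 /* NOP / unimplemented */
        cpu->cycles += 1;
        break;
    }
    /* An accurate emulator would now also advance the video and audio
     * chips by the same number of cycles and check for interrupts,
     * multiplying the host work per guest instruction even further. */
}

int main(void) {
    ToyCpu cpu = {0};
    cpu.mem[0] = 0x01; cpu.mem[1] = 5;  /* LDA #5 */
    cpu.mem[2] = 0x02; cpu.mem[3] = 3;  /* ADD #3 */
    toy_step(&cpu);
    toy_step(&cpu);
    printf("a=%u cycles=%llu\n", cpu.a, (unsigned long long)cpu.cycles);
    return 0;
}

Running this, two guest instructions (a = 8 after 4 guest cycles) take dozens of host instructions to emulate. Multiply that overhead by every chip in the machine, all kept in lockstep cycle by cycle if you want accuracy, and the "slow chip needs a fast host" ratio stops being surprising.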
