This is something I am having trouble wrapping my head around. Take PCem, for example. It can emulate up to a Pentium II, but a Pentium III is nearly impossible given current hardware constraints. Yet the original Pentium III ran at around 450 MHz, and modern CPUs are well into the 5 GHz range. Even so, to accurately emulate a ~450 MHz chip you need an enormous amount of host CPU.
Why is that the case, if the CPU you're using to perform the emulation is vastly more powerful?
I read it's the same even for the Super Nintendo: its CPU ran at about 3.58 MHz, and for accurate emulation you'd need around 3 GHz (which is common today, but wind back a few years and it would have been the same question).
Hopefully that makes sense; I am still trying to understand emulation on a deeper level. Happy to get links to any docs that answer this question as well.
Imagine someone is writing a script in an old dead language like ancient Greek. The ancient Greek would be the original console: that's what it speaks and writes.
However, it's now been about a thousand years, and even though English speakers can read and write it to a degree, it's harder for us because there are nuances to the older language that aren't easily compatible.
Emulators have an extra “layer of translation” so to speak to communicate with the Greeks.
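To make that "layer of translation" concrete, here's a toy sketch (an invented instruction set, not any real console) of the inner loop most emulators run. Each guest instruction costs the host a fetch, a decode branch, the actual work, and a step of every other chip that must stay in lockstep, which is why one guest cycle can eat dozens of host cycles:

```python
def step_video(cycles):
    # Placeholder: a cycle-accurate emulator must advance the video/audio
    # chips in sync with the CPU, or games that race the hardware break.
    pass

def run(program):
    regs = {"A": 0}
    pc = 0
    while pc < len(program):
        op, arg = program[pc]          # fetch: a host memory access
        if op == "LOAD":               # decode: a host branch per opcode
            regs["A"] = arg            # execute
        elif op == "ADD":
            # emulate 8-bit wrap-around the way the original chip would
            regs["A"] = (regs["A"] + arg) & 0xFF
        elif op == "HALT":
            break
        pc += 1
        step_video(cycles=2)           # keep the other hardware in sync
    return regs["A"]

print(run([("LOAD", 200), ("ADD", 100), ("HALT", None)]))  # 44 (300 mod 256)
```

So a single guest "ADD" becomes many host operations, and the tighter the timing accuracy you want, the more often you have to pause and synchronize every chip, which is where the 3 GHz-for-3.58 MHz overhead comes from.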