This is something I am having trouble wrapping my head around. Take PCem, for example. It can emulate up to a Pentium II, but a Pentium III is nearly impossible due to current hardware constraints. Yet a Pentium III ran at around 450 MHz (if I remember correctly) and modern CPUs are well into the 5 GHz range. Still, to accurately emulate that ~450 MHz chip you need some much larger amount of host CPU.
Why is that the case, when the CPU you're using to perform the emulation is vastly more powerful?
I read it's the same even for the Super Nintendo: its CPU ran at roughly 3.6 MHz, yet for accurate emulation you'd need around 3 GHz (which is common today, but wind back a few years and it would have been the same question).
Hopefully this makes sense; I am still trying to understand emulation at a deeper level. Happy to have links to any docs that answer this question as well.
If your modern computer could just do what the old ones you are emulating did, then you wouldn't need to emulate them at all. Those machines had specific hardware built to do specific things, and your modern system has to recreate that behaviour in software, which is far less efficient. The emulator has to fetch, decode, and execute every guest instruction itself, and an accuracy-focused emulator also has to keep the video, audio, and timing chips in lockstep, cycle by cycle. That is why each guest clock cycle costs many host clock cycles (time/steps).
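To make that overhead concrete, here is a minimal sketch of the fetch-decode-execute loop that interpreter-style emulators run. It is not taken from PCem or any real emulator: the guest CPU, its two opcodes, and its timings are made up purely for illustration. The point is that one guest instruction costs dozens of host instructions of bookkeeping, before you even add the other chips.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical, stripped-down guest CPU: one register, a tiny memory,
   and two opcodes. Real emulated CPUs have hundreds of opcodes plus
   video, audio, and timer chips that all have to stay in sync. */
typedef struct {
    uint8_t  a;          /* accumulator          */
    uint16_t pc;         /* program counter      */
    uint64_t cycles;     /* guest cycles elapsed */
    uint8_t  mem[256];   /* guest memory         */
} GuestCpu;

enum { OP_LOAD = 0x01, OP_ADD = 0x02, OP_HALT = 0xFF };

/* Execute one guest instruction. Every step the original hardware did
   "for free" in silicon (fetch, decode, timing) becomes explicit host
   work here. */
static int step(GuestCpu *cpu) {
    uint8_t opcode = cpu->mem[cpu->pc++];    /* fetch */
    switch (opcode) {                        /* decode + execute */
    case OP_LOAD:
        cpu->a = cpu->mem[cpu->pc++];
        cpu->cycles += 2;                    /* model guest instruction timing */
        break;
    case OP_ADD:
        cpu->a += cpu->mem[cpu->pc++];
        cpu->cycles += 2;
        break;
    default:                                 /* treat anything else as HALT */
        return 0;
    }
    /* A cycle-accurate emulator would also advance the video and audio
       chips here for every guest cycle just spent; that per-cycle
       bookkeeping is where "a ~3 GHz host for a ~3.6 MHz console"
       comes from. */
    return 1;
}

int main(void) {
    GuestCpu cpu = {0};
    const uint8_t program[] = { OP_LOAD, 40, OP_ADD, 2, OP_HALT };
    for (unsigned i = 0; i < sizeof program; i++)
        cpu.mem[i] = program[i];

    while (step(&cpu))
        ;
    printf("a = %u after %llu guest cycles\n",
           cpu.a, (unsigned long long)cpu.cycles);
    return 0;
}
```

Fast emulators cut this cost with tricks like dynamic recompilation, but accuracy-focused ones (PCem, cycle-accurate SNES emulators) deliberately pay it so the timing matches the original hardware.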