Why does emulation require X amount of CPU power for accuracy?


This is something I am having trouble wrapping my head around. Take PCem, for example: it can emulate up to a Pentium II, but a Pentium III is nearly impossible on current hardware. Yet a Pentium III ran at around 450 MHz (if I remember correctly) and modern CPUs are well into the 5 GHz range. Even so, to accurately emulate that ~450 MHz chip you need far more than 450 MHz of host CPU.

Why is that the case, when the CPU performing the emulation is vastly more powerful?

I read it’s the same even for the Super Nintendo: its CPU ran at about 3.5 MHz, and for accurate emulation you’d need around 3 GHz (which is common today, but wind the clock back a few years and it would have been the same question).

Hopefully that makes sense; I am still trying to understand emulation on a deeper level. I’d be happy to have links to any docs that answer this question as well.


10 Answers

Anonymous 0 Comments

You’re playing the game [Simon Says](https://spectrum.ieee.org/media-library/photo-of-the-box-of-an-electronic-game-that-says-simon-and-shows-two-hands-pressing-colorful-buttons-on-a-round-object.png?id=50539567&width=400&height=250) with three friends, split into two teams, and each team has its own game hardware. On each team, one player shouts instructions while their teammate hits the buttons, and whoever can hit the buttons the fastest, the most times in a row, wins.

The only problem? Both players on the host team are native English speakers. You, however, are partnered with a foreign exchange student who does not know a single word of English, only Spanish. So now you have to communicate in a new language by looking up each word as you go (imagine you don’t have the short-term memory to retain them; you can only remember English), while still being as quick and accurate as the other team, just to spit out the same pattern in roughly the same rhythm.

Now multiply this problem by a few million, and you have an oversimplified picture of the trouble with translating instructions in real time between two totally different architectures. Next, realize that most consoles since the 16-bit era have separate teams working to output video and sound, and from the 32-bit era onward, sub-teams that handle only things like the math to figure out where the corners and edges of a 3D object are, and then how its surface should look from your viewing angle (a floating point unit).

So you hop over to the room with the “video” team, only to find that the room is physically organized in a totally different way. Let’s say the guy who is really good at the 3D sizing math calculates objects as hypothetical squares to save time, not triangles like you’re used to. And on top of that, the people inside that room speak yet another language compared to what your video guys speak. So now that’s a whole new set of issues to worry about when they pass work off to the other teams that have to play in concert.
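To make the “look up every word” problem concrete: an emulator’s core is basically a fetch-decode-execute loop, and every single guest instruction costs the host many of its own instructions. Here is a toy sketch in Python with a completely made-up instruction set (the opcodes, registers, and memory layout are invented for illustration, not from any real chip):

```python
# Minimal sketch of a guest-CPU interpreter loop. Every guest
# instruction triggers a fetch, a decode (the if/elif chain), and an
# execute step on the host -- dozens of host operations per guest one.

def run(memory, steps):
    regs = {"A": 0, "PC": 0}           # hypothetical guest registers
    for _ in range(steps):
        opcode = memory[regs["PC"]]    # fetch
        regs["PC"] += 1
        if opcode == 0x01:             # decode + execute: LOAD A, imm
            regs["A"] = memory[regs["PC"]]
            regs["PC"] += 1
        elif opcode == 0x02:           # ADD A, imm (8-bit wraparound)
            regs["A"] = (regs["A"] + memory[regs["PC"]]) & 0xFF
            regs["PC"] += 1
        elif opcode == 0x00:           # HALT
            break
    return regs["A"]

# Guest program: LOAD A, 5; ADD A, 3; HALT
print(run([0x01, 5, 0x02, 3, 0x00], 10))  # -> 8
```

Three guest instructions already cost the host a loop iteration, a memory read, a comparison chain, and bookkeeping each, which is the per-word translation tax from the analogy.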

If you manage to write a dictionary to speed up lookup of the translations, that helps, but the timing will never quite match the native English-speaking team’s. So you decide to hire dedicated translators for the teams, but now you’ve doubled your team size, and the overhead is still enormous. It actually grows faster than you’d think, because you then need translators between the audio, video, and main teams too.
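The “dictionary” idea is real, by the way: dynamic recompilers cache translated code so the slow lookup only happens the first time. A toy sketch of that caching pattern (the `translate` step and its output string are invented for illustration; real recompilers translate whole blocks of guest code into host machine code):

```python
# Toy translation cache: translate a guest opcode once, then reuse it.
# This mirrors what dynamic recompilers (JIT-style emulators) do.

translation_cache = {}

def translate(opcode):
    # Pretend this is the expensive "look the word up" step.
    return f"host_code_for_{opcode:#04x}"

def fetch_translation(opcode):
    if opcode not in translation_cache:   # slow path: first encounter
        translation_cache[opcode] = translate(opcode)
    return translation_cache[opcode]      # fast path every time after

print(fetch_translation(0x01))  # first call pays the translation cost
print(fetch_translation(0x01))  # second call hits the cache
```

Even with the cache, you still pay the lookup on every instruction, which is why the timing never quite matches a native speaker.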

Eventually you iron all of this out, only to realize that the original Simon Says team, who all speak the same language, has just a few dozen members and can whip out instructions smoothly without delay, while you need 1000x as many people to relay things between rooms just as quickly and accurately. Finally, a long time later, after you’ve worked out all the kinks, you match them.
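That “keeping every room in rhythm” part is where accuracy gets expensive: an accurate emulator has to advance every chip in lockstep so their interactions land on exactly the right cycle. A hedged sketch of that scheduling idea (the component names and the equal clock ratio are invented; real consoles tie their chips together with precise, unequal clock ratios):

```python
# Lockstep scheduling sketch: every emulated cycle, each chip must be
# advanced in order, because any chip might interact with the others
# at exactly this moment. Skipping ahead would break cycle accuracy.

def run_frame(cpu_step, ppu_step, apu_step, cycles):
    for cycle in range(cycles):
        cpu_step(cycle)   # main processor
        ppu_step(cycle)   # video chip
        apu_step(cycle)   # audio chip

counts = {"cpu": 0, "ppu": 0, "apu": 0}
run_frame(lambda c: counts.__setitem__("cpu", counts["cpu"] + 1),
          lambda c: counts.__setitem__("ppu", counts["ppu"] + 1),
          lambda c: counts.__setitem__("apu", counts["apu"] + 1),
          100)
print(counts)  # each chip advanced once per cycle, 100 times
```

Fast-but-inaccurate emulators let each chip run ahead for thousands of cycles before syncing; accurate ones pay this per-cycle coordination cost millions of times per second, which is where the huge host-CPU requirement comes from.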
