It has always puzzled me why video game console emulation is so resource-intensive. The consoles themselves are often only about as powerful as a PC that was already severely outdated at the time of the system's release, yet their emulators require ridiculously powerful PCs to run games at full speed and native resolution. Why exactly is that? I'm not looking for an explanation along the lines of "it's resource-intensive because it takes a lot of power to emulate the whole environment", because that's basically answering "it requires a lot of power because it requires a lot of power". I want to know *exactly* why I need a 4-core, 4.5 GHz 10th-gen Intel i7 to run a 2007 PS3 game at 720p, 13 years after its release.
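To make the question concrete, here is a minimal sketch (a hypothetical 3-instruction machine, all names made up) of the fetch-decode-execute loop at the heart of any pure-interpreter CPU emulator. Every single guest instruction costs the host a fetch, a decode, a dispatch branch, and the actual work, which is one concrete source of the overhead being asked about:

```python
# Hypothetical toy machine: each program entry is (opcode, operand_a, operand_b).
# Real emulators (and JIT-based ones like RPCS3) are far more complex; this only
# illustrates the per-instruction interpretation overhead.

def run(program, steps):
    regs = [0] * 4               # tiny register file
    pc = 0                       # program counter
    for _ in range(steps):       # one host loop iteration per guest instruction
        op, a, b = program[pc]   # fetch
        if op == "addi":         # decode + dispatch
            regs[a] += b         # execute: add immediate to register
        elif op == "mov":
            regs[a] = regs[b]    # execute: register-to-register move
        elif op == "jmp":
            pc = a               # execute: unconditional jump
            continue
        pc += 1
    return regs

prog = [("addi", 0, 1), ("jmp", 0, 0)]   # increment r0 forever
print(run(prog, 1000)[0])                # 1000 guest steps -> r0 == 500
```

Even this trivial loop spends most of its host cycles on bookkeeping (tuple unpacking, string comparison, branching) rather than on the one addition the guest program actually performs.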
In: Technology