I’ve always known that console games run better due to optimisation, but I don’t exactly understand what that entails. When a game is too demanding for my PC to run, I just remove stuff, even stuff the devs don’t normally let me remove. But devs can make the same hardware do way more without sacrificing graphical fidelity.
God of War comes to mind. The PS4 has a GPU roughly equivalent to a 750 Ti, yet the game looks far better than it does on my 1050 Ti, especially since the 1050 Ti system can only hold 30 fps even after applying FSR. So how do they do that? How come the code used to optimise for PlayStation doesn’t carry over to my 1050 Ti, even though consoles use x86 SoCs now?