what exactly is being “improved” when we see games graphically improving with time?


I understand that in a game world, we have different graphical features that come together to produce the visuals we see in a game, like lighting, textures, anti-aliasing, etc. What I’m wondering is, when a company comes out with the latest, visually mind-blowing game in a series (let’s take Uncharted as an example), what is involved in making the graphics nicer? What sets, say, Uncharted 4 apart from Uncharted 3? What’s happening under the hood that produces these improved graphics? Are the engineers at the company actually writing new equations or code to model reflections, water, shadows, and other things differently? Or is this just because hardware is becoming more powerful?

Sorry if the question is vaguely worded; it is tricky for me to pin down since I don’t have a technical background myself.

In: Technology

2 Answers

Anonymous

Digital Foundry is a great source for details on graphics technologies for video games. They do a lot of comparisons between gaming hardware, as well as between different installments of a series over time. For example, see the one on [Far Cry](https://youtu.be/bPmnETNnGc0).

The tldr is that the improvements are a combination of new techniques for modeling physics, geometry, and lighting, plus enhancements to the machines gamers have access to. These factors take turns limiting what a game can look like, with few exceptions. One consistent exception is the notorious Crysis, which was designed to take advantage of graphics hardware that did not yet exist, and which can still be made to bring modern hardware to its knees more than a decade after its initial release.
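To make the “new techniques” part concrete, here’s a toy sketch of what “writing new equations to model reflections differently” can look like. It is not code from any shipping engine, just two classic textbook shading models side by side: a diffuse-only model, and a refinement that adds a camera-dependent specular highlight so surfaces look glossy.

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir):
    """Diffuse-only shading: brightness depends only on the angle
    between the surface and the light, so everything looks matte."""
    return max(0.0, dot(normalize(normal), normalize(light_dir)))

def blinn_phong(normal, light_dir, view_dir, shininess=32):
    """A later refinement: add a specular term that also depends on
    where the camera is, producing glossy highlights."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))  # half-vector
    diffuse = max(0.0, dot(n, l))
    specular = max(0.0, dot(n, h)) ** shininess
    return diffuse + specular
```

Each game generation layers more of these refinements on (and eventually replaces them with physically based models), which is a big part of why the same scene looks richer in a sequel.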

Another trick is implementing the same old techniques in ways that require less effort, freeing up precious resources on the graphics hardware for other work that can make games look even better. A recent example is a feature of newer video cards called Deep Learning Super Sampling (DLSS). It allows a game to be rendered at lower quality and then **image enhanced** by an artificial intelligence trained on extremely high-quality renders from the game…renders which no consumer video hardware could produce. By using this technique, a game could spend the spare cycles to, theoretically, render more trees or other terrain, presenting the illusion of a more complete world.
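The savings are easy to see with back-of-the-envelope pixel counting. The 1440p internal resolution below is just an illustrative assumption (real upscalers use various internal resolutions, and the upscaling pass itself has a fixed cost that this sketch ignores):

```python
def pixels(width, height):
    """Total pixel count of a frame at the given resolution."""
    return width * height

native_4k = pixels(3840, 2160)   # resolution the player sees
internal  = pixels(2560, 1440)   # resolution the GPU actually shades

# Fraction of per-pixel shading work skipped before upscaling.
saved = 1 - internal / native_4k
print(f"{saved:.0%} of the pixels never have to be shaded")  # prints "56% ..."
```

Those freed-up cycles are the budget that can go toward denser foliage, longer draw distances, or fancier effects.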

Finally, the actual substance of games has improved as the promise of returns on investment has driven up the production budgets of games. For example, Grand Theft Auto 5 was one of the most expensive games ever produced, but it ended up making more money in its first 24 hours than any entertainment product before it (including TV shows, movies, …), and it keeps making money years after release. Returns like this encourage higher production values for future games. That translates to better voice and motion-capture acting, higher-quality modeling of real-life architecture, people, materials, and terrain, and better story writing.

All of this is why games keep getting better and better. It’s part better tech. It’s part better hardware in gamers’ hands, allowing past tech to be cranked up. It’s part getting more efficient. And the last bit is the fact that games make a lot of money and can therefore justify huge production values.
