why do Graphics drivers need an update when a new game comes out?


Surely by optimising your graphics for that game you are unoptimising for every other?

In: Technology

7 Answers

Anonymous 0 Comments

You aren’t really optimizing your graphics card for the game specifically. It’s more like giving your graphics card a set of instructions on how to optimally run the game when it is open. So in that regard, you aren’t losing the instructions for other games, you’re just getting an additional list

Anonymous 0 Comments

It doesn’t really happen as much these days, but a lot of formerly bleeding-edge technologies (like SLI) didn’t have any useful test cases until a demanding new game came out. Other times, drivers would fix bugs that weren’t noticed until new games threatened to expose them to the public.

Most games now rely on a handful of well understood 3D engines. You don’t see things break as often as you did a decade or two ago.

Anonymous 0 Comments

They use game-specific profiles and workarounds in their drivers. They are only activated in the context of the particular game they are made for.
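Conceptually, you can picture this as a lookup table keyed by the game, falling back to safe defaults for everything else. Here is a toy sketch of that idea – all names and settings are invented for illustration, and real drivers do this at a much lower level:

```python
# Toy sketch of game-specific driver profiles (all names hypothetical).
# The driver matches the running process against a profile table and
# falls back to generic, safe defaults for any unrecognized game.

GAME_PROFILES = {
    "crysis.exe": {"shader_replacements": True, "threaded_optimization": "on"},
    "doom.exe":   {"shader_replacements": False, "threaded_optimization": "auto"},
}

DEFAULT_PROFILE = {"shader_replacements": False, "threaded_optimization": "auto"}

def select_profile(process_name: str) -> dict:
    # Unknown games get the defaults, so a tweak made for one title
    # never changes behavior for any other.
    return GAME_PROFILES.get(process_name.lower(), DEFAULT_PROFILE)

print(select_profile("Crysis.exe"))
print(select_profile("some_old_game.exe"))
```

This is why optimizing for one game doesn’t “unoptimize” the others: the profile only activates when that specific game is detected, and everything else keeps using the defaults.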

Anonymous 0 Comments

They generally don’t; in fact, they almost never *need* it. The drivers that ship with your card will likely work perfectly for its whole lifespan, with games new or old. However, these companies have done two things: 1) they have convinced consumers that updates are needed and relevant, and 2) they make often extremely minor optimization tweaks, generally focused on highly specific setups, that they tout as significant upgrades but that are rarely impactful in any way except to the tiniest subset of users in niche cases. Graphics drivers are basically fantastic out of the box, and not much changes on a specific piece of hardware, like ever.

Now you *should* keep your drivers up to date, but the gain for almost all updates is minor to insignificant over the lifespan of your card.

Anonymous 0 Comments

The OS and the graphics APIs are very deliberately designed to *separate* all of the software, including the game (running in user space “on top of” the operating system), from the hardware. This is an intentional and primary boundary that defines what an operating system *is*. It’s how the OS can take responsibility for the stability of the system – it’s the *only* thing that has complete, unrestricted, hardware-level access.

But games are running incredibly complex code on top of a very specific, purpose-built piece of hardware, so they need to be *as close as they can possibly be* to directly controlling it. The more layers of translation between a general-purpose, OS-provided API that “translates things safely” and the hardware, the more performance is lost.

So, there does exist a specific chunk of code that sits “in between” and across the protected boundary layer of the operating system. That’s the driver. That’s, like, *the very definition* of an operating system driver. When you install a driver, you’re installing “a piece of software that is specifically telling the OS that it is responsible for managing the hardware”. This can be very dangerous and it’s why you absolutely need administrator permissions to, for lack of a better word, “hot patch in a new piece of the OS” that’s allowed this level of access.

(just ask CrowdStrike – it wasn’t a Windows error that took down so many computers, it was a *CrowdStrike driver error* – the chunk of software that promised Windows “I won’t mess things up if you give me extra access”… went and messed things up.)

So when a new game comes out, the stock driver is essentially saying, “Sure, I know you *can* render this scene, with some performance loss, but it’s not going to be the most efficient way.” And that could work just fine!

But the driver updates essentially say “We’re going to provide a Crysis-specific (or whatever game) flavor of the driver that grants safe, but much more optimized/performant, access to the GPU so you can run the code on it as fast as possible”.

So: you can’t just grant unrestricted access to the hardware, and you could – but don’t want to – make all code use the safest, general-purpose way of doing things. Patching in a flavor of optimizations for each game is the compromise.
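The layering described above can be modeled as a toy in a few lines – an app talks to a driver, and the driver can either use the generic safe path or, once a driver update has taught it about a particular workload, a faster but still safe path. This is purely illustrative; real drivers operate far below this level of abstraction:

```python
# Toy model of the API/driver layering. A driver update adds an
# optimized, still-safe path for one specific workload, while all
# other workloads continue to use the generic path unchanged.

class Driver:
    def __init__(self):
        self.optimized_workloads = set()

    def install_update(self, workload: str) -> None:
        # A driver update teaches the driver a faster path for one
        # specific workload (e.g. one game's rendering patterns).
        self.optimized_workloads.add(workload)

    def render(self, workload: str) -> str:
        if workload in self.optimized_workloads:
            return f"{workload}: optimized fast path"
        return f"{workload}: generic safe path"

driver = Driver()
print(driver.render("new_game"))    # generic path before the update
driver.install_update("new_game")
print(driver.render("new_game"))    # fast path after the update
print(driver.render("old_game"))    # other games are unaffected
```

The point of the model is the last line: installing the optimization for one game changes nothing for any other workload, which is exactly why game-specific driver updates don’t degrade older games.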

Anonymous 0 Comments

Game companies get advanced previews of new technology the graphics card manufacturer is putting into their drivers. They’ll develop a game around these technologies knowing that support for them will show up in a driver update that ships around the same time as the game.

Anonymous 0 Comments

When a new game comes out, graphics drivers often need an update because the game might use new or complex graphics techniques that the current drivers don’t fully support or optimize for. Game developers often work closely with graphics card makers to fine tune how their game runs on the hardware.

Updating the driver ensures the game runs smoothly, using all the latest tech and fixes any bugs or glitches that might happen.

As for “unoptimizing” other games, the updates don’t mess with older games because drivers are designed to handle a wide variety of games and settings. They can improve a specific game’s performance without breaking older ones. Think of it like adding a new tool to a toolbox – it doesn’t make the other tools less effective, it just adds something extra!