eli5: For games that are in development for long stretches of time (like cyberpunk – 8yrs), how do developers deal with advancing technology?



Or are games like Cyberpunk or Red Dead just made with early-mid 2010’s tech?

Edit: to clarify, development didn’t actually start 8 years ago, but rather the game was announced 8 years ago. Thanks to the commenters for letting me know!

In: Technology

They just keep updating the graphics and other bits to match, which is why Cyberpunk doesn’t run properly on the older consoles despite having been originally intended for them.

Keep updating the code as time goes on.
Not a good strategy, but they have to make do; not everyone has the funds or people of a Tencent.

At this point, technology doesn’t actually change very much. The industry has seen enormous changes, but currently nothing like that is happening.

Take for instance that in the times of NES consoles and such, everything was sprite based. Everything had to be made from little 2D blocks of a fixed size. When 3D was introduced, all of that had to be completely thrown out — the way you do the graphics is different, the way the world works is different, the way controls work is different. So definitely, at times like that, it was hard to keep up. If you spent years polishing up your old-school 2D game, you might have found that by release day 3D had exploded and your stuff now looked old, and redoing everything for 3D would be impossible without starting from scratch.

But we seem to be mostly done with this kind of thing. The most similar thing going on today is VR, which, while not that different from 3D games, does have its own peculiarities and requirements that can be challenging to adapt to. But it’s still much less jarring than the shift to 3D was. Also, console hardware is far less “special” — modern consoles are pretty much standard PCs, rather than the extremely specialized devices they used to be, with various weird and very manufacturer-specific hardware.

Other than VR, which as far as I know Cyberpunk doesn’t use, we’re mostly adding incremental improvements: faster CPUs, faster video cards, faster storage, plus optional features like raytracing. Those are easy to adapt to. Make bigger textures, more complex assets, fill the world with more people, etc.

Cyberpunk was in pre-production until mid/late 2016, when CDPR finished the last bit of content for TW3. Active development started then.

Very generally: badly. It is pretty inconceivable for a game to be in planned development for 8 years — partly for the reason you mentioned: the technology keeps improving, and there is no visibility that far into the future.

A designer might have a game concept stored in their mind or in some file for years, waiting for hardware/software to mature to the point where it can be realized, but actual development lasting more than 3-4 years is very unusual.

It is nearly a certainty that anything with an 8 year development timeframe ran into very very very serious problems.

It’s like if you have a forest and an axe, but new tech is out and now you know you can spend a few days making a chainsaw.

Do you take the time to make a chainsaw? Would it help cut down more trees than the axe? Even though it cuts twice as fast, it also breaks if not cared for.

Now I have to conduct maintenance a few days a week to keep it working.

In the end I spend fewer hours cutting and cut fewer trees. I spend more hours working on the thing that cuts trees now.

Sometimes it’s just better to keep using the axe. So it really depends on whether you need trees today or a bunch tomorrow.

Same with code. Do we need to update all this, or will it work as is? Some things must be updated; some things don’t need to be. Worker gloves and eye protection are great updates even for axemen, but the chainsaw might just be a pretty luxury.

The game Kenshi has a really interesting development cycle that’s relevant to your question. IIRC a small team made the game over a ten-year period, and the engine they used is, as a result, very dated. Somewhere there is an article about that game and your question. The devs had to push the limits of the engine they had, etc.

Great game, but its interface was very old-style.

Haven’t played Cyberpunk yet as I’m waiting for the dust to settle.

Chainsaws are legit better than axes though.

In software development we usually use a library or framework to develop our code. These export their functionality: each function has a name, inputs, and outputs. For example, you could use some library to do a calculation where you pass 2 numbers as input and receive one number as the result. Now let’s imagine newer computers can do the same calculation twice as fast by using some advanced hardware processing or whatever; for our code it will be the same thing. All we need to know is that we receive the result after sending two numbers as input.
So you can even develop for something that does not yet exist, as long as your target does not change the functionality it exposes. So I think that, in terms of technology advancing, for developers it should mostly be a matter of configuring the libraries they use, and occasionally tweaking something that changed or got removed.
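A toy sketch of that point, with made-up function names: callers only depend on the name and the input/output contract, so the internals can be swapped for a faster version without touching the calling code.

```python
# Hypothetical library function: the contract is "two numbers in, one out".
def multiply(a: int, b: int) -> int:
    """v1 internals: naive repeated addition (imagine old, slow hardware)."""
    result = 0
    for _ in range(abs(b)):
        result += a
    return result if b >= 0 else -result

def multiply_v2(a: int, b: int) -> int:
    """Same contract, faster internals (imagine new hardware support)."""
    return a * b

# Code written against the contract behaves identically with either version:
assert multiply(6, 7) == multiply_v2(6, 7) == 42
```

As long as the contract holds, the game code doesn’t care which version it is running against.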

One thing to note, as someone mentioned, it wasn’t actively being developed, software wise, for eight years.

Any program, but games especially, starts with pre-production, which is storyboarding, deciding the market you want to push to, etc.

After that it’s completely possible for a game to sit, not being worked on at all, for years (this generally depends on how big the studio is and whether the team tasked with it is busy).

After that it’ll go into active development and be there for anywhere from 2-5 years (could be longer, but that’s a generalization). As for what happens during this time, other people have answered it more in depth.

Tl;dr it wasn’t in development for 8 years, more like 4 (if what another redditor said is correct)

If you listen to old-school programmers, a common theme you hear is “this would have been so much easier to do today.” The advancing technology helps more than it hinders by removing constraints and bottlenecks.

Cyberpunk wasn’t in development for 8 years. It was announced 8 or so years ago, but development didn’t actually start until after The Witcher 3’s Blood and Wine DLC, so the game was only in development for the better part of 4 years.

But to answer your question: compared to the leap from the early 2000s to now, the leap from, say, 2012 to now is very small in terms of groundbreaking new technologies. Apart from perhaps VR and Nvidia’s RTX technology, very few significant changes have been made in the past 8 years. Sure, lighting, shadows, textures, AI, and environments are all better now because modern hardware can handle them, but overall not a lot has changed in the past decade.

Anything new or different can be implemented via the game code or engine tools as development continues, and most changes are of little enough significance that code or tools can be updated as needed without posing any real hindrance to development. There are so many people working on AAA titles that there are always people on hand to make whatever changes need to be implemented, so development can continue as normal without delays.

I dunno how it works in game development, but in industrial series production you have a preset of requirements the product needs to meet.

So I guess you plan beforehand which systems you want to release on, so you have the lowest bar you need to hit. But yeah, in the worst case they should have dropped the Xbone and PS4 and refunded those preorders via credit.

Maintaining any piece of software over a longer period of time can be like owning a car. Imagine if your car was all one piece, and any time something went wrong, you had to replace your entire vehicle. It would quickly be unsustainable.

Instead, car manufacturers have learned they can split the car into many pieces (engine, battery, tires, etc.). If one of the parts requires fixing or becomes outdated, it is hopefully simpler and less expensive to replace it with the latest version of that part.
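The same modularity idea can be sketched in code (all names here are illustrative): the car depends only on an engine interface, so an individual part can be swapped without replacing the whole vehicle.

```python
class Engine:
    """A replaceable part with a fixed interface."""
    def start(self) -> str:
        return "vroom"

class ElectricEngine(Engine):
    """A newer part that honors the same interface."""
    def start(self) -> str:
        return "hum"

class Car:
    # The car only depends on the Engine interface, not a specific model.
    def __init__(self, engine: Engine):
        self.engine = engine

    def drive(self) -> str:
        return self.engine.start()

# Upgrading one part doesn't require rebuilding the whole car:
old_car = Car(Engine())
upgraded_car = Car(ElectricEngine())
```

Software built from pieces behind stable interfaces gets the same benefit: one subsystem can be brought up to date while the rest keeps working unchanged.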

From my experience working in it: they will either tackle it by building on an engine that is extensible and can grow, worry about the problem when it happens, build on an engine where assets are easily portable to newer versions, or, in extreme cases, start over entirely. I’ve personally seen the last one happen.

In many cases a framework like Unity or Unreal is used. In general the frameworks keep up with the changes so the game developers don’t need to focus as much on that. They may need to implement new features to take advantage of some of the advancements, though.

In my experience: a game our team was working on had to be either scrapped or rebuilt from the ground up when a major version of Unity came out. We had already spent a couple of years on it and it wasn’t completely ready. Every time we fixed 2 bugs, 5 more showed up.

It seems like everyone working on Cyberpunk knew it wouldn’t be ready but they had a deadline and it needed to be met.

To answer your question, what developers do is they start over with new software or scrap the project if they can.

Graphics are usually one of the last things done. Look at early builds of modern games and they’ll look like something from the PS1 era.

In general, games are built ahead of their time, and when all the content is done it’s optimized to work on the current generation of technology. For example, almost all 3D models in a game are made in very high quality first, and then the details are reduced and/or faked for the version used in the game. If there is a significant leap in technology during development, you only need to make a new in-game version of the model, not a totally new model.
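A deliberately simplified sketch of that "author high, ship low" idea (the numbers and function are made up, and real tools do far smarter mesh reduction than this): the master asset keeps full detail, and shipped versions are derived from it at whatever density the target hardware can handle.

```python
def decimate(vertices, keep_every):
    """Toy detail reduction: keep every Nth vertex of the source mesh."""
    return vertices[::keep_every]

# Stand-in for a 1000-vertex master asset authored at full quality.
source_mesh = list(range(1000))

low_detail = decimate(source_mesh, 4)   # 250 vertices for older hardware
high_detail = decimate(source_mesh, 2)  # 500 vertices once hardware improves
```

If faster hardware arrives mid-development, you re-derive from the same master asset with a higher budget instead of remaking the model.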

Like others have said, CP2077 was probably done in 4-5 years, and before that it was mostly on paper. It’s worth mentioning that when a game takes more than 3-4 years, it sometimes means the initial work went in the trash, or at least most of it. So you can say the game was in production for 8 years, and that’s somewhat true, but the thing you are playing was actually made in 3-4.

I actually asked my friend who works in game dev this same question the other day! She told me that for a big-budget AAA game like Cyberpunk that’s developed on a proprietary engine, the engine and the code they write for it are specifically designed so that they can keep doing rolling updates throughout the development process without needing to rewrite too much code. Game devs also get the specs for new hardware before it’s announced to the public, so the Cyberpunk devs were able to build their engine for the PS5 before the PS5 even officially “existed.”

She also told me that for smaller or indie games that are developed on non-proprietary engines like Unity, it’s more common that the game *is* technologically behind the times, just like you said here; but apparently it’s not too difficult to update your code to keep up with rolling tech updates if that’s something that’s important to you, so some studios do put in the extra effort to keep their games as up to date as possible throughout development.

Interfaces/APIs or abstraction layers.

Everything talks to the engine via an interface or API: a layer between the game code and the things you need for the game (textures, movement, physics calculations, etc.).

You want to draw a circle, so you call the drawCircle() method in your code. You don’t care how the engine draws the circle, and even more importantly, all the code the team writes is written so that they don’t have to.

The layer that actually does the drawing of the circle can be totally rewritten if need be, and everyone else who uses it wouldn’t even know. They just update their dependencies.

Of course there are bugs to deal with and the occasional deprecated call, but other than that you start and finish doing the same work.
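The abstraction-layer idea above can be sketched like this (class and method names are made up for illustration): the game code calls one drawing function and never knows which backend actually renders it, so the backend can be rewritten mid-project.

```python
class OldRenderer:
    """Original backend behind the abstraction layer."""
    def draw_circle(self, x: int, y: int, r: int) -> str:
        return f"raster circle at ({x},{y}) r={r}"

class NewRenderer:
    """Rewritten internals (say, for newer GPU features); same interface."""
    def draw_circle(self, x: int, y: int, r: int) -> str:
        return f"gpu circle at ({x},{y}) r={r}"

def game_frame(renderer) -> str:
    # Game code depends only on the draw_circle() contract,
    # not on which renderer is plugged in underneath.
    return renderer.draw_circle(10, 20, 5)
```

Swapping OldRenderer for NewRenderer changes how the circle gets drawn, but none of the code that calls draw_circle() has to change.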