Eli5: Why don’t computer games use all available RAM?


For example, there’s a racing game I play that is a 20GB total install and I have 32GB RAM. The game never uses more than 5GB RAM.

Why does it unload and reload between races? Wouldn’t it make sense to keep everything in RAM after it’s loaded it once?


6 Answers

Anonymous 0 Comments

You’re mixing up a few different things here. Space on a storage drive (probably some flavor of SSD on a modern machine) is where the 20GB install lives. RAM holds the data that actively running programs are working with, and the game apparently needs about 5GB of that. VRAM is a lot like regular RAM but sits physically on your graphics card and is mostly dedicated to textures and other large assets the GPU needs exclusive, lightning-fast access to; high-end cards these days have 10+ GB of it, and a modern game with 4K textures can eat up a lot of it.

And as another commenter describes, the main reason not to load a program’s entire install into RAM on launch (it would have to be reloaded every time anyway, since RAM is wiped between reboots) is that a well-behaved program should never *assume* that maxing out your resources is desirable, especially when doing so gains very little. Most people don’t have enough RAM to hold an entire game (certainly not most modern games), so games can’t be designed around that strategy. Some will let you tweak these settings to better take advantage of specific hardware (Minecraft’s chunk-loading limit is a decent example), but that depends on the overall design of the game and adds development work: you can’t load up 10% more of a race without getting creative in ways that can mean more debugging effort.
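If it helps to picture the idea, here is a rough sketch in Python of the kind of policy being described: ask the OS how much RAM is actually free, take only a slice of it as an asset-cache budget, and evict the least recently used assets instead of growing forever. All names here are made up for illustration, and real engines do this in native code with far more sophistication; the only real API used is `psutil.virtual_memory()`.

```python
from collections import OrderedDict

import psutil  # third-party library, used only to ask the OS how much RAM is free


def pick_cache_budget(max_fraction=0.25, hard_cap_gb=4.0):
    """Hypothetical sizing rule: take a modest slice of the RAM that is
    currently free, capped at a fixed maximum, so the OS and other
    programs still have room to breathe."""
    free_bytes = psutil.virtual_memory().available
    return int(min(free_bytes * max_fraction, hard_cap_gb * 1024**3))


class AssetCache:
    """Toy LRU cache: keep recently used assets in memory up to a byte
    budget, and evict the least recently used ones when over budget."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.assets = OrderedDict()  # name -> bytes, least recently used first

    def load(self, name, read_from_disk):
        if name in self.assets:
            self.assets.move_to_end(name)   # cache hit: mark as recently used
            return self.assets[name]
        data = read_from_disk(name)          # slow path: actually hit the SSD
        while self.assets and self.used + len(data) > self.budget:
            _, evicted = self.assets.popitem(last=False)  # drop oldest asset
            self.used -= len(evicted)
        self.assets[name] = data
        self.used += len(data)
        return data
```

The point of the sketch is the budgeting, not the cache itself: the program sizes its memory use relative to what is free right now rather than grabbing everything it could theoretically fit.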
