What’s the difference between a “32-bit” and “64-bit” game? How does this affect the engine and how much RAM it can use?

This hit me today as I was prepping some pasta. I’ve got a relatively beefy gaming rig with 12 gigs of VRAM and 48 gigs of regular RAM. Even so, older games still have a tendency to drop frames when a lot of stuff is happening at once. From what I’ve read, this is because there’s something about the games, or their engines, that stops them from using my computer’s full RAM capacity or drawing from it as much as they need. I’ve even noticed this when configuring game settings for, as an example, *Total War: Rome II*: even though it detects my graphics card, it won’t draw on its full strength, always locking at around 3-4 gigs. By contrast, the more modern *Total War: Warhammer III* can use my rig’s full power, meaning I basically never drop frames when playing it.

Why is this? What inherently stops 32-bit games from using more VRAM?

8 Answers

Anonymous 0 Comments

For the most part, it means which CPU architecture the game was compiled for. All new CPUs have been 64-bit for a good while now.

A 32-bit CPU (or a 32-bit program) uses 32-bit memory addresses, which gives only about 4 billion (2^32) unique numbers. That’s why it can use a maximum of 4GB: there simply aren’t addresses for any more than that.
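
If it helps, here’s a minimal C sketch (not from any game or engine, just the arithmetic) showing where the 4GB figure comes from:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 32-bit pointer is just a 32-bit number, so it can name at most
       2^32 distinct byte addresses. */
    uint64_t addresses = (uint64_t)1 << 32;   /* 4,294,967,296 */

    printf("2^32 addresses = %llu bytes\n", (unsigned long long)addresses);
    printf("               = %llu GiB\n",   (unsigned long long)(addresses >> 30)); /* 4 */
    return 0;
}
```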

64-bit computers won’t be running out of memory addresses for a long time: a 64-bit address space covers about 16 exabytes (2^64 bytes).
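
And the same back-of-the-envelope check for the 64-bit case (again just a sketch; the pointer size it prints depends on how you compile it, 8 bytes on a typical 64-bit build):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* On a 64-bit build, pointers are 8 bytes (64 bits) wide. */
    printf("pointer size on this build: %zu bytes\n", sizeof(void *));

    /* Full 64-bit address space: 2^64 bytes.
       1 EiB = 2^60 bytes, so 2^64 / 2^60 = 2^4 = 16 EiB. */
    printf("64-bit address space = %llu EiB\n",
           (unsigned long long)((uint64_t)1 << (64 - 60)));
    return 0;
}
```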
