What’s the difference between a “32-bit” and “64-bit” game? How does this affect the engine and how much RAM it can use?

This hit me today as I was prepping some pasta. I’ve got a relatively beefy gaming rig with 12 gigs of VRAM and 48 gigs of regular RAM. Even so, older games still tend to drop frames when a lot is happening at once. From what I’ve read, this is because something in the games, or their engines, keeps them from using my computer’s full RAM capacity or drawing on it as much as they need. I’ve even noticed it when configuring settings in, for example, *Total War: Rome II*: even though the game detects my graphics card, it won’t use the card’s full capacity, always capping at around 3-4 gigs. By contrast, the more modern *Total War: Warhammer III* can use my rig’s full power, and I basically never drop frames when playing it.

Why is this? What inherently stops 32-bit games from using more VRAM?

8 Answers

Anonymous

The bit size of an application determines how much memory it can address (2^32 bytes = 4 GB), but because of the way Windows manages memory, a 32-bit app normally gets only 2 GB of that for itself. There are ways to push that to 3 GB, or close to 4 GB on a 64-bit OS, such as setting the Large Address Aware flag on the executable.
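As a rough illustration (a minimal C sketch, not tied to any particular game or engine), it’s the pointer size that caps the address space:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* The size of a pointer sets how many distinct byte addresses a process can use. */
    printf("pointer size on this build: %zu bits\n", sizeof(void *) * 8);

    /* 32-bit pointers: 2^32 addresses of one byte each = 4 GiB total,
       and Windows reserves part of that range for the OS. */
    uint64_t bytes_32 = (uint64_t)1 << 32;
    printf("32-bit address space: %llu bytes (%.0f GiB)\n",
           (unsigned long long)bytes_32,
           bytes_32 / (1024.0 * 1024.0 * 1024.0));

    /* 64-bit pointers: 2^64 addresses (~16 EiB), far more than any real machine has. */
    printf("64-bit address space: 2^64 bytes (~16 EiB)\n");
    return 0;
}
```

Everything the game allocates, including what it tracks about VRAM, has to fit inside that addressable range, which is why a 32-bit title stalls at a few gigs no matter how much memory the machine actually has.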

Bit size also affects how much data the CPU can process in a single instruction, and which instruction set extensions (such as newer SIMD) the program can use. A 64-bit build won’t be twice as fast as a 32-bit one, but it will generally perform better.
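A small sketch of the data-width point: adding two 64-bit numbers is a single instruction on a 64-bit x86 build, while a 32-bit x86 build has to split it into two. Compiling the function below for each target and comparing the assembly (e.g. `gcc -m32 -O2 -S` vs `gcc -m64 -O2 -S`, assuming GCC with 32-bit support is installed) shows the difference.

```c
#include <stdint.h>

/* On a 64-bit x86 build this compiles to a single 64-bit add.
   On a 32-bit x86 build the compiler emits two instructions:
   an add for the low halves and an add-with-carry (adc) for the high halves. */
uint64_t add64(uint64_t a, uint64_t b) {
    return a + b;
}
```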
