What’s the difference between a “32-bit” and “64-bit” game? How does this affect the engine and how much RAM it can use?


This hit me today as I was prepping some pasta. I've got a relatively beefy gaming rig with 12 gigs of VRAM and 48 gigs of regular RAM. However, older games still have a tendency to drop frames when a lot of stuff is happening at once, even with these specs. From what I've read, this is because there's something in the games, or their engines, that means they can't use my computer's full RAM capacity or draw from it as much as they need. I've even noticed this when configuring game settings for, as an example, *Total War: Rome II*: even though it detects my graphics card, it won't draw on its full strength, always locking at around 3-4 gigs. By contrast, the more modern *Total War: Warhammer III* can use my rig's full power, and I basically never drop frames when playing it.

Why is this? What inherently stops 32-bit games from using more VRAM?


8 Answers

Anonymous

Most devices are 64-bit nowadays, but not all software takes advantage of that. The primary benefit of 64-bit software is that it can address much larger amounts of memory: a 32-bit program uses 32-bit memory addresses, so it can only refer to 2^32 bytes, or 4 GB, no matter how much RAM is installed. In practice the operating system reserves part of that address space for itself, so a 32-bit game often gets only 2-3 GB to work with, which is exactly the kind of cap you're seeing. 64-bit CPUs also have newer instruction sets unavailable to 32-bit software, so there are some performance advantages there too.
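To see that limit concretely, here's a rough C sketch of the arithmetic (just an illustration, assuming a toolchain like gcc that supports a 32-bit `-m32` build alongside the default 64-bit one):

```c
/* Rough sketch: the width of a pointer caps how much memory a process
 * can even address. Build it as 32-bit (e.g. gcc -m32) and as 64-bit
 * to compare. */
#include <stdio.h>

int main(void) {
    size_t ptr_bytes = sizeof(void *);   /* 4 on a 32-bit build, 8 on a 64-bit build */

    if (ptr_bytes == 4) {
        /* 32-bit addresses can name at most 2^32 bytes = 4 GiB,
         * regardless of how much RAM the machine has installed. */
        printf("32-bit pointers: at most 4 GiB of address space\n");
    } else {
        /* 64-bit addresses can name 2^64 bytes in theory; current CPUs
         * and OSes cap it lower, but it is still millions of GiB. */
        printf("%zu-bit pointers: far more address space than any game needs\n",
               ptr_bytes * 8);
    }
    return 0;
}
```

The same arithmetic applies whether the memory in question is system RAM or VRAM the game is trying to map: everything the program touches has to fit inside that one address space.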
