What’s the difference between a “32-bit” and “64-bit” game? How does this affect the engine and how much RAM it can use?


This hit me today as I was prepping some pasta. I’ve got a relatively beefy gaming rig with 12 gigs of VRAM and 48 gigs of normal RAM. However, older games still have a tendency to drop frames when a lot of stuff is happening at once, even with these specs. From what I’ve read, this is because there’s something in the games, or their engines, or whatever that means they can’t use the full RAM capacity of my computer or draw from it as much as they need. I’ve even noticed this when configuring game settings for, as an example, *Total War: Rome II*, where even though it detects my graphics card it won’t draw on its full strength to get what it needs, always capping out at around 3–4 gigs. By contrast, the more modern *Total War: Warhammer III* can use my rig’s full power, meaning I basically never drop frames when playing it.

Why is this? What inherently stops 32-bit games from using more memory?


8 Answers

Anonymous

Imagine you live on a street with a rule that house numbers can’t be more than two digits long. You could never have more than 99 homes on the street, and even if you somehow built more, you couldn’t give them an address, so how would anyone send mail to those homes? Change the rule from two digits to four, and suddenly the street can hold 9999 homes.

The number of bits a system uses is the same kind of concept. If memory can only be addressed with 32 bits, there is a hard limit on how much memory can be reached: 2^32 distinct byte addresses, which works out to 4 GiB (and in practice a 32-bit game often gets only 2–3 GiB of that, which is why you see it cap out around 3–4 gigs). Even if you have far more physical memory installed, the software can’t get to it, because it has no way to name an address beyond what 32 bits can express. Move to 64 bits and the addresses become vastly larger, allowing access to far more memory than any current machine actually has.
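If it helps to see the arithmetic, here’s a small illustrative C program (my own sketch, not part of the original answer) that prints how wide a pointer is on whatever it’s compiled as, and how big a 32-bit address space actually is:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A pointer is just a memory address. Its width limits how much
       memory a process can ever refer to. */
    printf("Pointer size on this build: %zu bits\n", sizeof(void *) * 8);

    /* 32-bit addresses: 2^32 distinct byte addresses = 4 GiB total. */
    uint64_t limit32 = (uint64_t)1 << 32;
    printf("32-bit address space: %llu bytes (4 GiB)\n",
           (unsigned long long)limit32);

    /* 64-bit addresses: 2^64 byte addresses, about 16 EiB --
       far more than any machine actually has installed. */
    printf("64-bit address space: 2^64 bytes (~16 EiB)\n");
    return 0;
}
```

Compiled as a 32-bit executable, that first line prints 32 bits, and the program itself is stuck inside that 4 GiB ceiling no matter how much RAM the machine has, which is exactly the wall an old 32-bit game runs into.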
