Imagine a game like Flappy Bird.
You fly a bird and randomly generated obstacles keep flying towards you.
Now, information about those obstacles (for example, the size of the obstacle) needs to be stored somewhere on the computer.
As the game goes on, more and more obstacles appear, basically infinitely until you lose the game.
In a well-programmed game, you would write a line of code instructing the computer to delete the information about obstacles you have already flown past and that are no longer visible on the screen, because that information is no longer needed.
But if you’re not such a good programmer, you might forget to delete the information about old obstacles. Each time a new obstacle appears, information about it gets stored, but that information never gets deleted. So the longer you play the game, the more information piles up in memory, and eventually it starts causing problems like running out of memory space.
Basically like a box that you constantly keep putting items in, but never take anything out. Eventually you run out of space and you’re in trouble.
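In code terms, that forgotten cleanup might look roughly like this C sketch (the obstacle structure and function names here are made up for illustration):

```c
#include <stdlib.h>

/* Hypothetical obstacle data for a Flappy Bird-style game. */
typedef struct Obstacle {
    float x;        /* horizontal position on screen */
    float gap_size; /* size of the gap the bird must fly through */
} Obstacle;

Obstacle *spawn_obstacle(void) {
    /* Reserve memory to store this obstacle's information. */
    Obstacle *o = malloc(sizeof *o);
    o->x = 400.0f;                     /* starts just off the right edge */
    o->gap_size = 80.0f + rand() % 60; /* random gap size */
    return o;
}

void game_loop(void) {
    while (1) {                        /* simplified: runs until you lose */
        Obstacle *o = spawn_obstacle();
        /* ... scroll the obstacle left, check for collisions ... */

        /* The leak: once the obstacle scrolls off the left edge, the game
           should call free(o) to delete its information. Forget that line
           and every obstacle ever spawned stays in memory for the rest of
           the game. */
    }
}
```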
Like all computers, consoles have a limited amount of memory. If they run out of memory they can’t do anything else, and bad things happen. On older systems this could actually cause a total crash. On newer systems the operating system typically intervenes to free up memory in a hurry, but that usually means killing entire programs wholesale, which can cause problems in its own right.
To prevent this, programmers have to be careful about what the program remembers. You don’t want the computer to remember things it no longer needs, but you don’t want it to forget things it still needs to remember. Figuring this process out is called *memory management*. On early systems, programmers had to do all of the memory management themselves. More modern programming languages and game engines use a variety of techniques to take some of this burden off the programmer by handling it automatically. But for games in particular, these techniques are often not enough, and the programmer still has to do some manual memory management.
When memory management isn’t done correctly, a number of different things can happen, depending on the exact type of error. A *memory leak* is what happens when the computer habitually fails to forget things that it should be forgetting. That word “habitually” is key here. It is usually not a problem for the computer to remember one or two stray objects, but a leak happens when something is wrong in the whole management system, so there’s some whole category of objects that keep going into the system and never being forgotten. They pile up, either quickly or slowly, like water leaking into a bucket. It’s not a *huge* problem as long as the bucket doesn’t ever get full. But if it fills all the way up, Bad Things Happen, and memory leaks increase the risk that the bucket will fill up all the way.
*The Legend of Zelda: Breath of the Wild* makes some aspects of its memory management very visible to the end user, so I’ll use it as an example (its sequel, *The Legend of Zelda: Tears of the Kingdom*, does this too). As an open-world game, it has to remember many things about the state of the world: what enemies you’ve killed, what ore veins you’ve mined, what treasure chests you’ve picked up, what trees you’ve chopped down, and so on. The game world is very large, so every once in a while the system needs to forget what has happened in order to keep memory from filling up completely. It does this on a predefined schedule, determined by how much in-game time passes in the overworld. When the schedule says to, the game forgets everything about the world, and the game signals to you that this has happened with an event called the Blood Moon. This helps keep memory usage low, while also minimizing the end-user’s surprise. You are told the world resets when the Blood Moon happens, but technically it’s the opposite: the Blood Moon happens because the world is being reset.
But that schedule, as it turns out, is not enough. There are a number of ways to force a lot of objects to be created very quickly, to fill up the system’s memory. There are also places where the timer doesn’t run, like when you’re indoors, and these can stop a Blood Moon from happening when it otherwise would, letting memory fill up more slowly. To stop this from becoming a problem, there is another check: if system memory usage ever goes over a certain amount, the world resets *right then*, no matter what the in-game time is, even if you’re in situations where the timer should not be running. This is also signaled to the user, in the same way scheduled resets are: with an immediate Blood Moon. Fans call this a *Panic Moon*, because the system is panicking and hitting a sort of emergency stop button.
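We can only guess at how the game actually implements this, but conceptually the two triggers might look something like this C sketch (the function name, the schedule, and the threshold are all invented for illustration):

```c
#include <stdbool.h>

/* Hypothetical sketch of the two reset triggers described above.
   This is not Nintendo's actual code; the names and numbers are made up. */

#define PANIC_THRESHOLD 0.90   /* fraction of world-state memory in use */
#define RESET_INTERVAL  168.0  /* in-game overworld hours between scheduled resets */

/* Returns true if the world should be reset (i.e. a Blood Moon should fire). */
bool should_reset_world(double overworld_hours_since_reset,
                        double world_state_memory_usage /* 0.0 .. 1.0 */)
{
    /* Scheduled reset: enough in-game time has passed in the overworld. */
    if (overworld_hours_since_reset >= RESET_INTERVAL)
        return true;

    /* Emergency reset (the "Panic Moon"): world-state memory is nearly
       full, so reset right now regardless of the schedule. */
    if (world_state_memory_usage >= PANIC_THRESHOLD)
        return true;

    return false;
}
```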
When a program requires memory (RAM) to store resources, it requests it from the system and gets assigned a block of memory (an address range) reserved for its use. A well-designed program will report to the system when it is done with that memory, allowing the system to reassign it to a new function.
In the case of a “memory leak”, a program fails to release memory when it’s done with it. This unreleased but unusable memory builds up as the program keeps repeating the mistake. As time goes by, more and more memory stays reserved. Eventually, there is no free memory left to assign, and the application or even the entire system slows down or crashes.
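In C terms, the “request” and “report back” steps are `malloc` and `free`; a leak is simply the missing `free`. A minimal sketch:

```c
#include <stdlib.h>

/* Correct pattern: request a block of memory, use it, then report back
   that we are done so the system can reassign it. */
void no_leak(void) {
    char *buffer = malloc(1024);  /* request a block of memory */
    if (buffer == NULL) return;
    /* ... use the buffer ... */
    free(buffer);                 /* release it when finished */
}

/* Leaky pattern: the same request, but the release never happens. Call
   this in a loop and reserved-but-unusable memory piles up until there
   is nothing left to hand out. */
void leak(void) {
    char *buffer = malloc(1024);
    if (buffer == NULL) return;
    /* ... use the buffer ... */
    /* missing: free(buffer); */
}
```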
Your kitchen is your computer. Your fridge and pantry are the hard drive, and the counter is your RAM/memory.
You make a sandwich and get out the bread, mayo, meat, and cheese. After you finish making it, you are supposed to put all those things away, but this time you forget to put away the mayo, and now it is on your counter taking up space. The next time you forget to put away the meat… and so on.
Eventually, your counter (RAM/memory) is all filled up with shit you forgot to put away, and you have no space to prepare your dinner.
It’s not specific to games; it happens to any software.
Imagine you have a dishwasher. You load it up with cutlery and run it, but you forget to take the clean cutlery out. Next, you put more dirty cutlery in, still not taking anything out. This cycle repeats for a while, but eventually you will run out of cutlery or out of space in the dishwasher – either is a finite resource, just like memory.
A great analogy is shopping carts at a grocery store. You come into the store, pick up a cart, use it, and when you’re done you put it into a corral, signifying that you are done. Periodically a worker will collect carts from the corrals and return them to the store.
Now imagine someone shows up, takes a cart, and doesn’t return it when they’re done. They’re on foot or something so they just walk back home with the cart and leave it on the street in front of their house when they’re done, so the cart worker never notices it as an available cart. They do this a couple of times, and nobody really notices. But if they’re doing this twice a week for a year, suddenly you’re missing 100 shopping carts and there aren’t enough left in circulation to go around and people start getting frustrated when they walk into your store and can’t find a cart.
Getting a shopping cart is memory allocation. Putting it in the corral is releasing allocated memory. Cart workers are “garbage collection” that recycles released memory into circulation. The person taking the carts and not returning them is a memory leak.
Now how does this actually happen? Many higher-level languages have automatic garbage collection, so it’s rare to get a leak just because allocated memory wasn’t manually released. But lower-level languages have no garbage collector at all, so memory must be released explicitly by the programmer. In C, for instance, memory allocation is very explicit and manual: you carve out exactly the memory you want when an object is created, and you keep a pointer variable holding the address of the memory you allocated. If you later point that pointer at a different address without freeing the original allocation, you no longer have any way to find that memory in order to release it, and the system will keep that space reserved until the entire program is terminated.

But even in higher-level languages you can run into issues if you keep creating new objects without ever letting go of your old ones. If you have, say, a map that counts every word someone types in chat, and you continually add new words to it as you see them, and never clear that map out or close the program, eventually it can get large enough to consume all available memory and crash the program. This would be like a customer who walks into the store and keeps getting more and more carts, putting a few items in each, with the intent of buying them all, but never actually checks out.
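Here is a minimal C sketch of that first mistake: repointing the only pointer to an allocation before freeing it, so the memory can never be released.

```c
#include <stdlib.h>
#include <string.h>

void lose_track_of_memory(void) {
    /* Allocate a block and keep its address in p. */
    char *p = malloc(32);
    if (p == NULL) return;
    strcpy(p, "first allocation");

    /* Point p at a brand-new block WITHOUT freeing the first one. The
       original 32 bytes are still reserved, but no pointer in the program
       remembers where they are, so they can never be freed: a leak. */
    p = malloc(32);
    if (p == NULL) return;
    strcpy(p, "second allocation");

    free(p);  /* only the second block is released; the first one leaks */
}
```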