Why do computers need RAM memory? And why do some programs require lots of it while others require less?

248 views

Just curious about it, but can’t understand why it is like that.

In: 7

15 Answers

Anonymous 0 Comments

Don’t take the numbers I give as exact; they are just ballpark figures to help understand the timing.

Simply put, the closer data is to a CPU’s core, the faster operations can run. Data in registers is what a CPU actually operates on; registers are very small, measured in bytes. When data needs to be loaded into a register, time is wasted getting it there. So the solution was to make multiple levels, each holding more and more data but taking more and more time to access. After registers, the next level is cache on the chip, which is usually kilobytes in size and takes nanoseconds to reach. After that you get to RAM, which is gigabytes in size but takes roughly a hundred nanoseconds to access. After that is the hard drive, which takes milliseconds to access but holds terabytes. Importantly, if power is shut off, the only one designed to maintain stored data is the hard drive.
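To get a feel for why those levels matter, here is a minimal sketch in Python. The latency numbers and the `LATENCY_NS` table are illustrative ballpark figures like the ones above, not measurements of any real hardware:

```python
# Illustrative (not measured) latencies for each level of the
# memory hierarchy, in nanoseconds per access.
LATENCY_NS = {
    "register": 0.5,          # bytes; operated on directly by the core
    "cache": 5,               # kilobytes; on the chip itself
    "ram": 100,               # gigabytes; roughly 100 ns away
    "hard_drive": 5_000_000,  # terabytes; milliseconds away
}

def time_to_fetch(level: str, accesses: int) -> float:
    """Total time in milliseconds for `accesses` fetches from one level."""
    return LATENCY_NS[level] * accesses / 1_000_000

# Fetching a million values costs wildly different amounts per level:
for level in LATENCY_NS:
    print(f"{level}: {time_to_fetch(level, 1_000_000):,.1f} ms")
```

The same million fetches that take a tenth of a second from RAM would take over an hour from a spinning disk, which is why data gets staged into the faster levels before the CPU works on it.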

Different programs need different amounts of RAM because some operate on gigabytes of data, like a high-definition video game, while others operate on far less, like a text editor.

Computer science studies the complexity of doing the above and ways to do it faster.

Anonymous 0 Comments

In general a computer has 3 types of parts: logic cores, storage, and the “bus” which handles moving data between the logic cores and the storage.

Computer storage can be sorted along 2 axes: cost per bit and response speed.

A general rule is that as memory responds faster to requests for data, it gets more and more expensive.

Now, I am going to walk through an analogy of how the logic core in your CPU sees data in your system, which should help explain why the various levels of memory in computers exist:

You (as the logic core) are sitting at a table in a library. You have a book open in front of you, there are other books on your table and even more elsewhere in the library.

Now, the book open in front of you is the data that the logic core is actively working on; the other books on the table are in a memory type called “registers,” which are built into your CPU’s cores and can hold maybe a few kilobytes per core.

You might think that the rest of the library is RAM, but it’s actually something called “cache” which is built into your CPU and is pretty damn fast but maxes out on the order of megabytes.

Now what is RAM here? It’s another, larger library on another continent, which has to mail you any books you request via the post office.

The next question you might ask is “so, what about my hard drive or SSD?” Well, that’s an even larger library on the moon, possibly even Mars if it’s a hard disk, since that has even worse lag time. Either way, sending you a book is a laborious process that involves a spaceship and atmospheric entry.

You can also get magnetic tape storage which in this analogy would be an absolutely massive library on Pluto.

The exact ratios between the timings here are off, but this is meant as an exaggeration to convey an idea.

As to why some programs need more than others, that’s fairly simple: different programs are designed to do different things, and different tasks take different resources to do. It’s kind of like trains, when you plan on having a train carry more cargo you add more cars to it so that there’s space for everything.

Fun fact: on modern motherboards, the physical location of RAM is partially dictated by the speed of light, because of the length of the copper traces between those chips and the CPU.

TL;DR: RAM in computers serves as a balance between capacity and speed so that computers can access the data they need as quickly as they need it, and programs need to be able to access various quantities of data to serve their functions, which depends on what they’re designed to do.

Anonymous 0 Comments

Since I’ve seen many answers tackle RAM, let me tackle why programs need more or less of it. One part comes down to how the problem is framed. Let’s say I ask you to multiply every number between 1 and 16 quintillion by 12. That’s easy: you just count from 1, multiply each number by 12 along the way, and spit out the answers, needing almost no memory.
Now I ask for the multiples of only a *subset* of the numbers between 1 and 16 quintillion. Although it’s fewer numbers, that requires memory, because I have to store which numbers you want.
I can take another approach and say we’ll just multiply all the numbers, and I’ll pick out the ones I wanted from the results. This wastes processor time, but you’re back to not needing RAM. This is an example of the space-time tradeoff. Sometimes you just have to sacrifice one or the other to get your results, and developers choose based on the environment they’re working in. I’ve worked with old code that had tricks to deal with low-memory situations; if we were doing the same thing today, the solution would be more straightforward, since RAM is abundant compared to decades ago.
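The two approaches above can be sketched in a few lines of Python. The function names are made up for illustration; the point is only that one version spends memory on a set of wanted numbers while the other spends CPU time producing everything:

```python
# Approach 1: remember which numbers were requested (costs memory),
# then compute the multiple of 12 only for those.
def multiples_with_memory(wanted, limit):
    wanted = set(wanted)              # memory proportional to the subset size
    return [n * 12 for n in range(1, limit + 1) if n in wanted]

# Approach 2: compute every multiple and let the caller filter the
# results (costs CPU time, but a generator holds one value at a time).
def all_multiples(limit):
    for n in range(1, limit + 1):
        yield n * 12
```

With a small subset, approach 1 does far less arithmetic; with a huge subset, the set itself becomes the memory problem, and approach 2 starts to look attractive.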

Anonymous 0 Comments

RAM is the thing that the CPU can directly access. When you run a program – a game, a browser, or another app – the instructions for that application are relatively small, maybe only a few megabytes in size. Most apps – games, browsers, and so on – include many graphics and/or access a lot of data (maybe a database, or data retrieved over the web). The data the program uses will dwarf the program itself.

Games are typically many gigabytes in size: all those animations, effects, and backdrops are individually stored. A character who is walking may have hundreds of images to represent the walking motion, and the ‘game’ just layers the right image onto the display – like a glorified, old-style flip-book.

A web browser takes a small piece of information, like [http://google.com](http://google.com), and retrieves something from the website. In turn, that information has more links to other pieces of info – CSS/HTML/Javascript – so that the page looks good and has videos/images and adverts. This process repeats – many websites are composited from lots of different bits of information. The memory to store this can easily amount to hundreds of megabytes for one site. Much may be shared, e.g. going to [newyorktimes.com](https://newyorktimes.com) and [reddit.com](https://reddit.com) may involve bits which are similar. So the web browser stores this in a database, and ideally, the database needs to be in memory to make the application seem fast. (When a browser runs out of memory, the machine will feel very sluggish.)
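The core idea of keeping fetched resources in memory can be shown with a toy sketch. This is a simplified illustration of caching in general, not how any real browser is implemented, and `fake_fetch` is a stand-in for a network request:

```python
cache = {}        # in-RAM store, keyed by URL
fetch_count = 0   # how many times we had to go to the "network"

def fake_fetch(url):
    """Stand-in for a slow network request."""
    global fetch_count
    fetch_count += 1
    return f"<contents of {url}>"

def get_resource(url):
    if url not in cache:            # slow path: fetch and remember it
        cache[url] = fake_fetch(url)
    return cache[url]               # fast path: served straight from RAM

get_resource("https://example.com/shared.css")
get_resource("https://example.com/shared.css")  # second hit: no fetch at all
```

The second request never touches the network; the tradeoff is that the cache dictionary keeps growing in RAM, which is exactly why heavy browsing eats memory.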

Back in the days when everything was textual, a few MB of RAM was fine. Now that we play huge games and open dozens or hundreds of tabs, the inefficiencies pile up. Application or memory size isn’t a good way to measure ‘goodness’, but it is annoying when apps grow and grow, and eventually your nice PC feels like it needs an upgrade.

Anonymous 0 Comments

Computers work with data. How much data is needed fully depends on how complex the work is. That’s basically all there is to it. Writing a few lines of text doesn’t need much data, an Excel spreadsheet with a few thousand rows needs more data, and a video game with fancy 3D graphics and sound needs lots more data.

Data can be divided into two types: the kind that we need to store a lot of for a long time, but it’s ok if it’s a little slower to access; and data that we need to access really fast, but we don’t need much of it for very long. The former – persistent storage – is good for storing the programs themselves, as well as information we might need later (like documents, images, configuration). The latter – volatile storage aka RAM – is good for loading programs from disk when you need them, and storing all the working data (program state) that the program needs to run.

RAM is only temporary, so it doesn’t matter if it gets wiped when the computer turns off. In fact it can even be beneficial: if a program gets into a bad state, resetting memory can resolve transient issues. Any important data we need to keep is transferred to persistent storage. This relationship between RAM and storage is a fundamental architectural design of modern computers.
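That handoff from volatile to persistent storage looks something like the sketch below. The state dictionary and filename are made up for illustration; the point is that only the data worth keeping is written to disk, while scratch data simply vanishes with RAM:

```python
import json
import os
import tempfile

# Working state lives in RAM: a document worth keeping, plus scratch data.
state = {"document": "hello", "scratch_buffer": [1, 2, 3]}

# Before exit, persist only what matters to disk.
path = os.path.join(tempfile.gettempdir(), "demo_state.json")
with open(path, "w") as f:
    json.dump({"document": state["document"]}, f)

# After a "power cycle" the in-memory dict is gone; reload from disk.
with open(path) as f:
    restored = json.load(f)

print(restored)  # the scratch buffer was volatile and is simply lost
```

Real programs do the same dance at a larger scale: documents, settings, and databases go to disk; caches, undo buffers, and half-finished computations just evaporate.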

So TL;DR, more RAM is needed if your program is bigger, and/or if it uses large amounts of temporary data while it runs and does whatever it’s programmed to do.

And by a massive coincidence of physics and electronic engineering, it turns out it’s easy to manufacture small amounts of volatile memory that’s incredibly fast (RAM), or large amounts of persistent memory that’s an order of magnitude slower but still pretty fast, but pretty much impossible to get speed, capacity, and persistence all in a single device.