Why do computers need RAM? And why do some programs require lots of it while others require less?


Just curious about it, but I can’t understand why it is like that.


15 Answers

Anonymous 0 Comments

Computers work with data. How much data is needed depends entirely on how complex the work is. That’s basically all there is to it. Writing a few lines of text doesn’t need much data, an Excel spreadsheet with a few thousand rows needs more data, and a video game with fancy 3D graphics and sound needs lots more data.

Data can be divided into two types: the kind that we need to store a lot of for a long time, but it’s ok if it’s a little slower to access; and data that we need to access really fast, but we don’t need much of it for very long. The former – persistent storage – is good for storing the programs themselves, as well as information we might need later (like documents, images, configuration). The latter – volatile storage aka RAM – is good for loading programs from disk when you need them, and storing all the working data (program state) that the program needs to run.

RAM is only temporary, so it doesn’t matter if it gets wiped when the computer turns off. In fact it can even be beneficial: if a program gets into a bad state, resetting its memory can resolve transient issues. Any important data we need to keep is transferred to persistent storage. This relationship between RAM and storage is a fundamental architectural design of modern computers.

So TL;DR, more RAM is needed if your program is bigger, and/or if it uses large amounts of temporary data while it runs and does whatever it’s programmed to do.

And by massive coincidence of physics and electronic engineering, it turns out it’s easy to manufacture small amounts of volatile memory that’s incredibly fast (RAM), or large amounts of persistent memory that’s an order of magnitude slower but still pretty fast – but pretty much impossible to get all three (fast, large, and persistent) in a single device.

Anonymous 0 Comments

RAM is the thing that the CPU can directly access. When you run a program – a game, browser or other app – the instructions for that application are relatively small, maybe only a few megabytes in size. Most apps – games, browsers and so on – include many graphics, and/or access a lot of data (maybe a database, or data retrieved over the web). The data the program uses will dwarf the program itself.

Games are typically many gigabytes in size because all those animations and effects and backdrops are individually stored. A character that is walking may have hundreds of images to represent the walking motion, and the ‘game’ will just layer the right image to make up the display – like a glorified, old-style flip-book.

A web browser takes a small piece of information, like [http://google.com](http://google.com), and retrieves something from the website. In turn, that information has more links to other pieces of info – CSS/HTML/Javascript – so that the page looks good and has videos/images and adverts. This process repeats – many websites are composited from lots of different bits of information. The memory to store this can easily amount to hundreds of megabytes for one site. Much may be shared, e.g. [newyorktimes.com](https://newyorktimes.com) and [reddit.com](https://reddit.com) may have bits which are similar. So the web browser stores this in a database, and ideally, the database needs to be in memory to make the application seem fast. (When a browser runs out of memory, the machine will feel very sluggish.)

Back in the days when everything was textual, a few MB of RAM was fine. Now that we play huge games and open dozens or hundreds of tabs, the inefficiencies pile up. Application or memory size isn’t a good way to measure ‘goodness’, but it is annoying when apps grow and grow, and eventually your nice PC feels like it needs an upgrade.

Anonymous 0 Comments

Since I’ve seen many answers tackle what RAM is, let me tackle why programs need more or less of it. One part comes down to how the problem is framed. Let’s say I ask for 12 times every number between 1 and 16 quintillion. That’s easy: you just count from 1, multiply each number by 12 along the way, and spit out the answers. You never need to remember more than the number you’re on.
Now I ask for the multiples of a *subset* of the numbers between 1 and 16 quintillion. Although there are fewer numbers, that requires memory, because I have to remember which numbers you want.
I can take another approach and say we’ll just multiply all the numbers and I’ll pick out the ones I wanted from the results. This wastes processor time, but you’re back to not needing RAM. This is an example of the space-time trade-off. Sometimes you have to sacrifice one or the other to get your results, and developers choose based on the environment they’re working in. I’ve worked with old code that had tricks to deal with low-memory situations; if we were doing the same thing today, the solution would be more straightforward, since RAM is abundant compared to decades ago.
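To make the trade-off concrete, here’s a toy Python sketch (the function names and the multiplier 12 are just illustrative): one version spends memory on a precomputed table so every answer is an instant lookup, the other spends CPU time recomputing on every request.

```python
def build_table(limit):
    """Space-heavy: precompute 12*n for every n up front.

    Uses O(limit) memory, but each later lookup is instant.
    """
    return [n * 12 for n in range(limit + 1)]

def compute_on_demand(n):
    """Time-heavy: no table in RAM, recompute on every request."""
    return n * 12

table = build_table(1000)
print(table[7])              # instant lookup: 84
print(compute_on_demand(7))  # same answer, computed fresh: 84
```

With an operation as cheap as multiplication, the table is overkill; the trade-off only pays off when recomputing is expensive, which is exactly the judgment call the answer describes.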

Anonymous 0 Comments

In general a computer has 3 types of parts: logic cores, storage, and the “bus” which handles moving data between the logic cores and the storage.

Computer storage can be sorted along 2 axes: cost per bit and response speed.

A general rule is that as memory responds faster to requests for data, it gets more and more expensive.

Now, I am going to walk through an analogy of how the logic core in your CPU sees data in your system, which should help explain why the various levels of memory in computers exist:

You (as the logic core) are sitting at a table in a library. You have a book open in front of you, there are other books on your table and even more elsewhere in the library.

Now, the book you are holding is the data that the logic core is actively working on, the other books on the table are in a memory type called “registers” which are built into your CPU’s cores and can hold maybe a few kilobytes per core.

You might think that the rest of the library is RAM, but it’s actually something called “cache” which is built into your CPU and is pretty damn fast but maxes out on the order of megabytes.

Now what is RAM here? It’s another, larger library on another continent, which has to mail you any books you request via the post office.

The next question you might ask is “so, what about my hard drive or SSD?” Well, that’s an even larger library on the moon – possibly even Mars if it’s a hard disk, since that has even worse lag time. Either way, sending you a book is a laborious process that involves a spaceship and atmospheric entry.

You can also get magnetic tape storage which in this analogy would be an absolutely massive library on Pluto.

The exact ratios between the timings here are off, but this is meant as an exaggeration to convey an idea.

As to why some programs need more than others, that’s fairly simple: different programs are designed to do different things, and different tasks take different resources. It’s kind of like trains: when you plan on having a train carry more cargo, you add more cars so that there’s space for everything.

Fun fact: on modern motherboards the physical location of RAM is partially dictated by the speed of light, because of the length of the copper traces between those chips and the CPU.

TL;DR: RAM in computers serves as a balance between capacity and speed so that computers can access the data they need as quickly as they need it, and programs need to access various quantities of data to serve their functions, which depends on what they’re designed to do.

Anonymous 0 Comments

Don’t take the numbers I give as exact; they are just ballpark figures to help understand the timing.

Simply put, the closer data is to a CPU’s core, the faster operations can run. Data in registers is what a CPU actually operates on; this is very small, measured in bytes. When data has to be loaded into a register, time is wasted getting it there. So the solution was to make multiple levels that hold more and more data but take more and more time to access. After registers, the next level is cache on the chip, which is usually kilobytes and takes nanoseconds to reach. After that you get to RAM, which is gigabytes in size but takes around a microsecond to access. After that is the hard drive, which takes milliseconds to access but holds terabytes. Importantly, if power is shut off, the only one designed to keep its stored data is the hard drive.
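The ballpark timings above can be lined up in one place. These numbers are illustrative orders of magnitude only, not measurements of any particular machine:

```python
# Rough access latencies matching the levels described above.
latency_ns = {
    "register":   0.5,        # operated on directly by the core
    "cache":      5,          # "nanoseconds"
    "RAM":        1_000,      # "a microsecond"
    "hard drive": 5_000_000,  # "milliseconds"
}

for level, ns in latency_ns.items():
    # Show how many times slower each level is than on-chip cache.
    print(f"{level:>10}: {ns:>12,.1f} ns ({ns / latency_ns['cache']:>10,.1f}x cache)")
```

The jump from RAM to hard drive is by far the biggest, which is why programs slow to a crawl once they run out of RAM and start hitting the disk.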

Different programs need different amounts of RAM because some things operate on gigs of data like a high definition video game. But others operate on way less, like a text editor.

Computer science studies the complexities of doing the above stuff and ways to do it faster.

Anonymous 0 Comments

Think of a kitchen:

The fridge is the hard drive, it stores things for the long term.
The kitchen counter is memory.
The cook is the CPU.

Can the cook make a meal directly in the fridge? No, the fridge is made for holding lots of ingredients, not for chopping and mixing.

The cook needs the kitchen counter to prepare the ingredients from the fridge. They first get the ingredients from the fridge (hard drive), work with them on the counter (memory), and deliver the meal/result (output).

Anonymous 0 Comments

The CPU can only be used with a program: data that will literally instruct it what to do. The CPU makes a clear distinction between memory that serves as a program and data that will be processed (e.g. “an image”). Physically, both use the same RAM. A computer has (part of) the program and (part of) the data in RAM.

A computer will use more memory when there is a lot of data to process (e.g. a video) or when the program is big. Sometimes the data is already bundled with the application (e.g. with fonts or error messages) and sometimes this comes from other sources (e.g. the internet or disk).

Then there is the concept of cache. Cache is memory to temporarily store data or program (in order to avoid the costly endeavor of getting it from its slower source like a disk). RAM is often used as cache too.
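As a sketch of that idea in code, here is a RAM cache in miniature using Python’s `functools.lru_cache`. The slow source is faked with a sleep, and `load_image` is a made-up name for illustration:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=128)
def load_image(path):
    """Pretend this is a costly read from a slower source (disk)."""
    time.sleep(0.05)
    return f"pixels of {path}"

t0 = time.perf_counter()
load_image("cat.png")        # first call: pays the "disk" cost
cold = time.perf_counter() - t0

t0 = time.perf_counter()
load_image("cat.png")        # second call: answered from the RAM cache
warm = time.perf_counter() - t0

print(f"cold: {cold:.3f}s, warm: {warm:.6f}s")
```

The second call returns the stored result without re-running the function body, which is exactly the "costly endeavor" the cache avoids.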

The result is that, after a while, most computers use all memory available for the above functions: cache, program and data.

Anonymous 0 Comments

CPUs will perform a certain sequence of calculations every clock cycle. We ideally want memory fast enough such that the CPU can retrieve the data from it within a clock cycle.

However, due to material, space and power constraints, we can’t solely use the super fast memory (the super fast memory typically being L1 cache, which can be accessed within 1-2 clock cycles).

Therefore we established a hierarchical memory structure:

1) L1 cache (few hundred kilobytes, fastest)

2) L2/L3 cache (tens of megabytes, extremely fast)

3) RAM (gigabytes, fast)

4) storage drive (terabytes, slow)

As to what the RAM is used for: it’s whatever data the applications you have open need to store. For a web browser that would be webpages (HTML, Javascript, media content), the sandbox environments the webpages run in, and user interface icons.

A game would need to store object data, the game’s code, and sounds and textures that aren’t currently in VRAM but might be needed soon.

There are algorithms, such as First In First Out, that determine what data should go where, since we want the frequently used data in the L1 cache and the less frequently used data in RAM.
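As a sketch, here is what a First In First Out cache looks like in Python. This is a toy model; real CPUs and operating systems use more sophisticated eviction policies (often LRU approximations):

```python
from collections import OrderedDict

class FIFOCache:
    """When full, evict whichever entry was inserted first,
    regardless of how often it has been used since."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        return self.data.get(key)          # a hit does NOT change order

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            self.data.popitem(last=False)  # evict the oldest insertion
        self.data[key] = value

cache = FIFOCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.put("c", 3)            # "a" was inserted first, so it is evicted
print(cache.get("a"))        # None
print(cache.get("b"))        # 2
```

FIFO is simple but can evict data that is still in heavy use; that weakness is why smarter policies track how recently or frequently each entry was accessed.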

In large datacenters such as Facebook’s or Google’s, they also develop caching algorithms to allocate data between fast SSDs and slower HDDs for the sake of faster transfer to the client. The goal is to predict what data might be needed soon or more frequently and move it to the SSD.

Anonymous 0 Comments

OK, think of how we looked up information before computers.

You have questions and the encyclopedia at the library has answers. An encyclopedia can come in 22 volumes, each a separately bound book.

You have checked out volume “A” which contains all encyclopedia topics that start with the letter “A” and brought it back home.

(I’ll put the computer analogy in parentheses.)

You open the book

(start the program which loads the book contents from disk into RAM)

to an article about Aardvarks. You read that page.

(This is like reading data in the CPU L1 cache.)

The article continues, so you turn the page.

(This is like reading a page of data from the CPU L2 cache.)

But now you are curious about anteaters. So you go to the index and look up which page starts the anteater topic (page 328) and you turn to that page.

(This is like reading from RAM. The more RAM, the more pages you can have in the book.)

Anteaters are interesting, but **pangolins are cool**.

Except you don’t have encyclopedia volume “P” at home. So you go to the library to return volume “A” and ask to check out volume “P”.

(This is like getting data from disk. A bigger disk can store a bigger library of books.)

Except the librarian tells you that volume “P” is already checked out to Chad. You now have to wait until Chad returns that book before you can check it out and read it yourself. That might take a while.

***Chad!***

(This is like downloading data from the Internet where the request comes from your computer, to the ISP and on to some server, and the resulting data is sent by the server to your ISP and on to your computer.)

Knowing about pangolins is worth the wait. But if you had ~~more RAM~~ a giant encyclopedia, you could have had all topics in one volume and been learning about pangolins right now.

Anonymous 0 Comments

You do work at your desk, lots of paperwork. Your desk is only so big, so you can only have so many papers on your desk at one time. If you need more space, then you have to take a paper from your desk, put it in the filing cabinet, and get another paper you need. Things get really slow if you have to do this too often.

Say your workload is some basic accounting. You need a good number of documents to do this, but you can fit them all on your desk. You’re good. You pull all the needed documents from the filing cabinet, you work on them, you put them back, and you get the next set.

But then you get an account for a millionaire with very complex finances. You can’t fit it all on your desk anymore. You have to keep going back and forth to the file cabinet in order to figure it all out. You can do your work, but you could do it more quickly if you had a bigger desk.

Then an electrical engineer takes over your desk. It’s not even big enough to hold just one of his drawings. He simply can’t do his job. Maybe he could cut up the drawings and be constantly on the run ferrying pieces of the drawing between the desk and the filing cabinet, but that’s not a good way to work. He needs much more desk space.

Desk is RAM, filing cabinet is disk.