What is the importance of cache memory on phones and computers, and how does it work?


Cache is optimized for handing off information to storage, or to the processor. It’s like arranging the groceries from your shopping cart in the order you’d like them bagged up, keeping heavy stuff distributed and like items together, which makes for easier transport and unloading.

In simple terms, cache memory lets the device hold on to commonly used things, be it apps/programs, website data, whatever. Instead of the device having to load it fresh every time from an app/program you use a lot, or download it from the net every time you revisit a site, the cache holds parts of it on hand so the device can grab it locally.
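As a toy sketch of that idea in Python (the function and page name here are made up for illustration), `functools.lru_cache` keeps recent results on hand so repeated requests skip the expensive work:

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # keep up to 128 recent results in memory
def render_page(url):
    # Stand-in for an expensive load (disk read, network fetch, parsing...).
    return f"<html>contents of {url}</html>"

render_page("example.com/home")  # slow path: computed, then cached
render_page("example.com/home")  # fast path: served straight from the cache
print(render_page.cache_info().hits)  # 1
```

The second call never runs the function body at all, which is exactly the "grab it locally" behavior described above.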

Cache is simply temporary data storage. Let’s say you have an app which fetches some data from the internet to load a grid of profile cards. It’s inefficient to fetch these items every time you open the page, since the user already has this data from the previous load. Instead, you can store (cache) the data on disk to make load times faster (no need to fetch from the internet) and reduce strain on your servers. That’s one example of how a cache is often used.

The processor needs data to work with, and it’s wasteful if it ever sits waiting for that data. Hard disks are way too slow, SSDs are also slow, even RAM is too slow. So chip makers put memory right on the chip that can be accessed very quickly. But this memory is expensive, so they can’t include much of it and instead use it as a cache for whatever the processor is currently working on.

RAM is slow. Sure, compared to your hard drive or even an SSD it’s super fast, but from the processor’s perspective it’s really slow. Any time the CPU has to wait for data from RAM, it just sits idle. To combat this, CPUs have cache built directly into the chip that is much closer and much faster than RAM. Whenever the CPU reads from RAM, it first checks whether the data is in the cache and whether it’s “fresh”. If the data is stale or missing, it goes to RAM; but if it is fresh, it can read the data from cache and save a lot of time. If the cache is successful even half the time, this can provide a huge performance boost to the application. CPUs and operating systems put a lot of effort into making this cache as useful as possible to keep the CPU running as optimally as possible.
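That check-freshness-then-fall-back flow can be modeled in a few lines of Python. This is only a toy: real CPUs do all of this in hardware, and the version counter here is just a stand-in for however staleness is actually tracked:

```python
# Toy model of "check the cache, verify freshness, fall back to RAM".
ram = {0x10: 42}
ram_version = {0x10: 1}  # bumped whenever that address is written
cache = {}               # addr -> (value, version seen when the cache was filled)

def read(addr):
    entry = cache.get(addr)
    if entry is not None and entry[1] == ram_version[addr]:
        return entry[0]                       # fresh hit: fast path
    value = ram[addr]                         # stale or missing: slow path
    cache[addr] = (value, ram_version[addr])  # refill for next time
    return value
```

The first `read` of an address takes the slow path and fills the cache; repeat reads are served from the cache until the underlying value changes, at which point the stale entry is refilled.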

Cache is a small bit of exceptionally fast storage; the processor can pull stuff from it in under a nanosecond, which lets it keep moving quickly.

If something isn’t in cache it has to go to RAM, which takes around 10 nanoseconds, and even getting stuff from flash/SSD takes around 100,000 ns, which wastes a lot of time. HDDs are around 7,000,000 ns, which is abysmal.

The cache is there so the processor can have what it needs to keep busy and keep the task moving along; if it has to keep waiting for stuff from RAM or the SSD, it spends a lot of time sitting there doing nothing. There’s a lot of work that goes into ensuring the right stuff is in cache for each workload, because having to go to RAM even 10% of the time is a pretty big hit on performance.
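The size of that hit is easy to put numbers on. Using the rough latencies quoted above (1 ns cache, 10 ns RAM; both are ballpark figures, not specs for any particular chip), the average access time works out like this:

```python
# Back-of-the-envelope average access time under assumed latencies:
# 1 ns for a cache hit, plus 10 ns extra to go out to RAM on a miss.
CACHE_NS = 1
RAM_NS = 10

def avg_access_ns(miss_rate):
    # Every access checks the cache; only misses pay the RAM penalty.
    return CACHE_NS + miss_rate * RAM_NS

print(avg_access_ns(0.10))  # 2.0 -> a 10% miss rate doubles the average
print(avg_access_ns(0.01))  # ~1.1 -> a 1% miss rate barely hurts
```

Missing one access in ten doubles the effective access time, which is why so much engineering goes into keeping the miss rate tiny.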

For scale here, if you’re doing math homework the cache holds the paper next to you (<1 second access time), the RAM is the notebook in your backpack (10 seconds to get), the SSD is your locker at school which is closed for rest of the weekend (27 **hours** away) and anything on the hard drive may as well not exist because its on a small island in the pacific that is only accessible by small sailboat during certain seasons (81 **days** away). If every math problem required pulling out your notebook it’d slow you way down so you’d probably just keep it next to you to save time(caching it), but if every problem required getting a different book from your locker you’d never finish anything in the expected time