When I grew up, computers were slow: you spent a lot of time waiting for programs and games to load, and a computer could take minutes just to start up. A long time has passed since then, and with all the progress in technology and science you would think that today's computers would be fast as lightning.
But I still find myself waiting a lot, especially on a PC, where programs can take ages to start and keep crashing now and then. It's like it doesn't matter how powerful and fast computers get; the programs are always a step ahead and too big for the computers to handle.
They are lightning fast; maybe you're just remembering wrong. When I was a kid some 20 years ago, it would sometimes take me literal hours to watch a 5-minute online video maybe a tenth of a MB in size, that's how slow everything was. It took me about 3 days to download games; don't get me started on MapleStory, haha. Nowadays I can download 100 gigs in 10 minutes and watch 10 YouTube videos at the same time in quality so good it's like you can reach out and touch it.
The major causes of wait times in modern computer games are encryption, execution control, and phoning home.
When you start a game these days, it attempts to contact a server over the Internet and send that server some data about your experience; usually your MAC address and physical location when you first load, but it can be all sorts of data while you are playing. This data is used to help improve the games and how they are marketed. If the game has trouble reaching its server, it waits for a while.
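A rough sketch of why phoning home stalls startup (the function names and timings here are hypothetical, not any real game's code): if the telemetry call is made synchronously before loading begins, the player waits for the network; if it is fired off in the background, loading can start immediately.

```python
import time
from concurrent.futures import ThreadPoolExecutor


def phone_home():
    # Stand-in for a real telemetry request; an actual game would
    # POST hardware/session data to its vendor's server here.
    time.sleep(0.5)  # simulate network latency
    return "ack"


def start_game_blocking():
    # Naive startup: wait for the telemetry call before loading anything.
    start = time.monotonic()
    phone_home()  # many games sit out a long timeout here if the server is down
    return time.monotonic() - start


def start_game_async():
    # Friendlier startup: fire the telemetry call in the background
    # and begin loading game data immediately.
    start = time.monotonic()
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(phone_home)
    elapsed = time.monotonic() - start  # the game loop can begin now
    return elapsed, future


blocking = start_game_blocking()
async_elapsed, fut = start_game_async()
fut.result()  # telemetry finishes on its own time
print(f"blocking startup waited {blocking:.2f}s, async waited {async_elapsed:.4f}s")
```

The difference the player sees is exactly the gap between those two numbers: the network round trip either sits on the critical path of startup or it doesn't.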
It used to be that a game would simply load its data and get going, but these days that data is encrypted and secured by the operating system. To get its data, your game has to wait for the OS to check its security credentials and decrypt the data.
And then there is execution control, where the operating system has to make sure that your game is what you think it is before it even lets it run, and then it keeps checking that it isn't doing any forbidden things as it goes along.
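To give a feel for the integrity-check cost (this is an illustrative sketch, not any real OS's verification API): proving a file is what you think it is means hashing every byte of it, and that cost scales with file size. Modern games ship gigabytes of data.

```python
import hashlib
import time

# Pretend this 50 MB buffer is a game archive the OS must verify
# before letting the game touch it.
archive = b"\x00" * (50 * 1024 * 1024)

t0 = time.monotonic()
digest = hashlib.sha256(archive).hexdigest()
elapsed = time.monotonic() - t0

print(f"hashed {len(archive) // 2**20} MB in {elapsed:.3f}s -> {digest[:16]}...")
```

Scale that to a multi-gigabyte install, add signature checks and decryption on top, and the "why is it still loading" seconds start to add up.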
So the computer is faster than it was a decade ago, but it runs safer and does so much more than you can see.
>and keeps crashing now and then
I rarely see any crashes outside of early access software. If you see a lot of crashes, you may have a problem with your setup.
They are insanely faster.
I once got a copy of Joust that I played on a 386. The speed of the enemies wasn't locked to a timer, but to CPU cycles. When I later tried to play it on a Pentium, the computer was so much faster that the enemies were blindingly fast. I died immediately, every time.
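That Joust story is exactly the difference between cycle-locked movement and delta-time movement. A minimal sketch (hypothetical numbers, not the real game's code): moving a fixed step per loop pass ties speed to the CPU, while scaling movement by elapsed wall-clock time makes it identical on any machine.

```python
SPEED = 100.0  # enemy speed in pixels per second


def cycle_locked_update(x, iterations):
    # One fixed step per loop pass: position depends on how fast
    # the CPU can spin the loop.
    for _ in range(iterations):
        x += 1  # pixels per pass, however fast the loop runs
    return x


def delta_time_update(x, dt):
    # Position depends only on wall-clock time elapsed ("delta time").
    return x + SPEED * dt


# Say a 386 manages 1,000 passes per frame and a Pentium 100,000:
print(cycle_locked_update(0, 1_000))    # 1000
print(cycle_locked_update(0, 100_000))  # 100000 -- enemies 100x faster
# Delta-time movement is the same on both machines for a 1/60 s frame:
print(delta_time_update(0.0, 1 / 60))
```

This is why essentially every modern engine passes a `dt` into its update loop instead of counting iterations.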
Remember Luxo Jr., the lamp in Pixar's logo? The short film about him took a very long time to render in the 1980s, many days IIRC. By the late 1990s a Mac could render it on the fly.
Other than that, we also ask computers to do a lot more these days. The operating systems and programs are far larger and more complex, and with that they do a lot more.
There are two things going on here. The first is a trick our minds play a lot: they suppress bad memories and make good memories better. The computers you remember using when you were young were certainly far worse than you think.
Secondly, don't blame "computers"; blame the computer you use. Mine starts up in seconds, and it plays sounds and computes images in real time that even the most powerful machines of my childhood could only pre-render. And it does VR. Starting an app or a game is mostly instantaneous too, thanks to NVMe storage.
TL;DR: you got a potato computer.
Windows sucks, and that's partly due to its own success: there's no real competition in the consumer or business PC market. Much of this software is written in an object-oriented way, which tends to favor flexibility over performance; each additional feature or layer adds cost, which is one potential reason for slow startup times. Until recently, graphics code, including games, was almost entirely serial, using only a single thread, and that didn't scale to more powerful multicore PCs. Legacy software is often not thread-safe, so a lot of operations, including startup, remain serial as well. Concurrency in languages like C and C++ is difficult to use correctly, more so in large projects at megacorps. So it's a good bet that significant portions of code will not run in parallel and utilize the hardware to the fullest unless that's absolutely necessary. Even in games, parallelism has only become common relatively recently.
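The serial-startup point can be made concrete with a toy sketch (the task names and timings are made up): four independent loading steps done one after another take the sum of their times, while running them concurrently takes roughly the time of the slowest one.

```python
import time
from concurrent.futures import ThreadPoolExecutor


def load_task(name, seconds=0.2):
    # Stand-in for loading assets, parsing config, scanning plugins, etc.
    # time.sleep releases the GIL, like real I/O would.
    time.sleep(seconds)
    return name


tasks = ["shaders", "audio", "config", "plugins"]

t0 = time.monotonic()
serial = [load_task(t) for t in tasks]           # one after another
serial_time = time.monotonic() - t0

t0 = time.monotonic()
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(load_task, tasks))  # all at once
parallel_time = time.monotonic() - t0

print(f"serial {serial_time:.2f}s, parallel {parallel_time:.2f}s")
```

Of course this only works when the steps really are independent; the comment's point is that legacy code is often not thread-safe enough to prove that, so developers play it safe and stay serial.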
> When I grew up computers were slow and you had to spend a lot of time waiting for programs and games to load, and the startup time for a computer could be minutes.
You’re forgetting the early game consoles, they were all instant-on.
> Since then a long time has passed, and with all the progress in technology and science you would think that todays computers would be fast as lightning.
They are. A Raspberry Pi Model B, an "old" hobbyist single-board computer by today's standards that you can buy for less than $10, is more than 5x faster on the same benchmark than a Cray-1 supercomputer from 1975.
> But still I find myself waiting a lot.
Your software is bloated.
Wirth's Law (1995, "A Plea for Lean Software") states that software is getting slower more rapidly than hardware is getting faster. (In the original article, Wirth actually credits the observation to Martin Reiser, who is sometimes confused with Hans Reiser of ReiserFS infamy, but today the law carries Wirth's name.)
Herb Sutter wrote "The Free Lunch Is Over" in 2005. He wasn't writing about the end of Moore's Law, but about the end of frequency scaling: shrinking CMOS geometries and shorter distances had long allowed higher clock frequencies and lower latencies, but heat dissipation, leakage, and quantum effects became significant limiting factors. Moore's Law still holds because chip makers keep adding more cores rather than making individual pipelines faster.
Jevons' paradox states that the cheaper and more plentiful a resource becomes, the more of it gets used. It isn't specific to computing; it applies to everything. Make cars cheaper, and more people will buy and drive cars. Try to save the world by increasing fuel efficiency, and you end up driving more total fuel consumption. Likewise, make computers faster, and people will write fatter software.
Parkinson's Law states that work expands to fill the time available for its completion. It's an old adage about bureaucracies, but it also applies to how software expands to fill all available memory. Just look at your web browser: here's a piece of software more complex than many operating systems, and it's more performant to just allocate more memory than to use more efficient memory management.
—
There are text editors out there that are 4 KB or less; the whole thing fits in a tiny corner of your CPU's 48 KB L1 instruction cache. We're talking file I/O, copy-paste, menus, mouse and keyboard support; they do everything any other basic editor does, something slightly more robust than Notepad. Notepad is about 1.5 MB, by the way. Why so fat for the same functionality or less?
That's not the worst of it. Electron is a framework for writing GUI applications that was all the rage a few years back. Atom, a popular text editor, is written in it: just opening the application takes 650 MB of memory. The framework is monstrously fat and unstable, but it lets developers produce a lot of functionality quickly.
I can go on and on, but the problem is this is a phenomenon, it just happens, not because of technology, but of humanity. We do this.
You can always run leaner software, but would you notice the difference? You have expectations, and so long as the software fits those expectations, then where’s the problem? Right? And how fast and lean do you want to get? And what are you willing to sacrifice to get it? Your GUI desktop takes gigabytes of memory, but we’ve had windowing systems since the 1980s that fit in kilobytes. You can run Linux and XFCE which will get you down to sub-gigabyte levels, if you want. I do.
What drives me nuts is USB keyboards. There is a substantial lag between when you press a key and when the text shows up on the screen. I write code all day; I notice. Go ahead and type a lot for a couple of years to get acclimated, then go to a 1980s Commodore 64 and try the same. The whole hardware and software stack from the physical key to the screen is entirely different, and measurably, noticeably faster. USB is fat and laggy as shit.
—
When I was working in high-speed trading, we did some things to make the software fast, but not really a lot. You'd be surprised. But we would drop $5k on a new network card that would spare us 600 nanoseconds. Why? Because developer time is expensive and hardware is cheap. That card cost less than 2 weeks of developer salary. I could spend that time making the software faster, but the next feature addition could eat those gains and then some.
I dunno, man, for me there’s an indescribable intuition about this. We endeavor to manage the expectations of humans, specifically our management. There are more costs to developing software than just clock cycles. We can absolutely make small, lean, fast software, but that’s rarely ever the end goal. We’re not writing world class software that is going to endure, we’re writing business software for today to try to capture market share. Why invest expensive developer time when we can scale in other ways, when we can temper expectations, when we can exploit those expectations? Maybe a faster, leaner Notepad would capture customer attention, but can we hold onto that performance profile? That market? Those expectations? Would we really hurt the competition that much? There is a lot of amazing software out there that is absolutely superior, but no one is using it.
Oh, there’s another one you want to google, “Worse is Better”. It has to do with software acceptance. Linux and Windows are both hot pieces of garbage. Really. There are absolutely superior operating systems out there, but they never took off, because although better, they’re not that much better to warrant a migration of existing software and expertise and expectations and usability and practicality. What good is good software if no one knows how to use it, because the market is already established?
**Please read this entire message**
Your submission has been removed for the following reason(s):
Loaded questions, **or** ones based on a false premise, are not allowed on ELI5. A loaded question is one that posits a specific view of reality and asks for explanations that confirm it. These usually include the poster’s own opinion and bias, but do not always – there is overlap between this and parts of Rule 2. Note that this specifically includes false premises.
If you would like this removal reviewed, please read the [detailed rules](https://www.reddit.com/r/explainlikeimfive/wiki/detailed_rules) first. **If you believe this submission was removed erroneously**, please [use this form](https://old.reddit.com/message/compose?to=%2Fr%2Fexplainlikeimfive&subject=Please%20review%20my%20thread?&message=Link:%20https://www.reddit.com/r/explainlikeimfive/comments/oony7p/eli5_why_arent_computers_faster/%0A%0APlease%20answer%20the%20following%203%20questions:%0A%0A1.%20The%20concept%20I%20want%20explained:%0A%0A2.%20Link%20to%20the%20search%20you%20did%20to%20look%20for%20past%20posts%20on%20the%20ELI5%20subreddit:%0A%0A3.%20How%20is%20this%20post%20unique:) and we will review your submission.