ARM isn’t really revolutionary. It’s been around for almost 40 years. It was originally developed by Acorn, the company behind the BBC Micro, and used in its computers back in the era of the Commodore 64. Then Acorn started working with Apple to develop a new version for Apple’s Newton personal digital assistant, and that’s when the company came to be called ARM (Advanced RISC Machines).
From then on, ARM focused on portable systems, where raw performance matters less than efficiency. It was used in a few desktop computers early on, but the real focus went to portables, where energy efficiency and low heat are important. x86 and others like DEC Alpha went the opposite route, maximizing performance (far faster than any ARM of the time), but also running very hot and drawing a lot of power.
And there ARM stayed for years, making incremental performance improvements aimed at low-power applications such as embedded systems and portables. ARM licensed its designs to whoever wanted them, and licensees could build chips based on them. This was called a “core” license: you design your own chip and drop a standard ARM core into it. The first iPhone used an off-the-shelf ARM chip made by Samsung.
Then Apple came back and got an “architecture license,” which allowed it not just to build chips around standard ARM cores, but to design its own wildly different cores that implement the ARM instruction set. Apple bought chip design companies (P.A. Semi and Intrinsity) and put far more effort into making ARM fast than ARM itself ever did. Eventually its phone chips could compete with lower-end laptop processors. Then Apple decided to make a laptop/desktop version to compete with x86.
TL;DR: x86 came from a position of not caring about power consumption, and later tried to bolt on some power savings. ARM came from a position of caring about power consumption while being fast enough, and only later worried about raw performance. Now that our world is centered on battery-powered portables, ARM’s philosophy has become very popular.