For a long time, most electronics were built around chips that could run a few programmed steps but weren’t full computers: microcontrollers. Think of a coffee maker with a timer and a display. It’s just a very basic, cheap chip that can keep time, switch things on and off, beep, and so on. It’s hard to say these “boot” the way a computer does, because there’s no operating system to load. They just execute a few commands when power is applied (like turning on the display and showing the time), then wait for input and keep the time updated.
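Just to make that concrete, here’s a tiny toy version in C of what a chip like that might run. All the names are made up for illustration, not any real coffee maker’s firmware, and the “hardware” calls are faked with printf so it compiles on a normal PC:

```c
/* Toy "firmware" for a timer-display gadget: no operating system,
 * just set up once, then loop forever. Hardware calls are faked with
 * printf here; a real microcontroller would poke registers instead. */
#include <stdio.h>
#include <time.h>

static void display_init(void)         { printf("display on\n"); }
static void display_show(int h, int m) { printf("clock %02d:%02d\n", h, m); }

int main(void) {
    display_init();                  /* runs once when power is applied */
    int last_min = -1;
    for (;;) {                       /* then the chip just loops forever */
        time_t now = time(NULL);
        struct tm *t = localtime(&now);
        if (t->tm_min != last_min) { /* redraw only when the minute changes */
            display_show(t->tm_hour, t->tm_min);
            last_min = t->tm_min;
        }
        /* a real device would also check buttons here: start brewing, beep, ... */
    }
}
```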
It was cheaper to build devices this way than to put a full microprocessor in them, because those were much more expensive. These days, though, we have cheap ARM chips, the same kind of processor that’s in all of our phones, just far less powerful. They’re in everything, especially anything sold as a “smart” product. These do boot an operating system, even if it’s a very small and simple one.
I had an early LCD TV. It didn’t have a full computer chip; it just had chips and other electronics that each did one specific job to turn the signal into pixels and sound, so it turned on and showed a picture almost instantly. Today, any TV is going to have an ARM chip and an operating system, so it has to boot before you see anything. To make it look like it doesn’t boot, “turning it off” usually just blanks the screen and mutes the speakers while the operating system keeps running. That’s why the manual lists some power draw even when the TV is “off.”
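If it helps, here’s a rough, made-up C sketch of that “fake off” idea; it’s not any real TV’s software, just the shape of it:

```c
/* The power button doesn't shut anything down; it only toggles a standby
 * flag that blanks the panel and mutes the audio. The system keeps running,
 * which is also why the TV still draws power when it looks off. */
#include <stdbool.h>
#include <stdio.h>

static bool standby = false;

static void set_panel(bool on) { printf("panel %s\n", on ? "on" : "off"); }
static void set_audio(bool on) { printf("audio %s\n", on ? "on" : "off"); }

/* imagined handler for the remote's power button */
static void on_power_button(void) {
    standby = !standby;      /* flip between "on" and "pretend off" */
    set_panel(!standby);     /* blank the screen... */
    set_audio(!standby);     /* ...and mute the speakers */
    /* no shutdown, no reboot: nothing has to boot again next time */
}

int main(void) {
    set_panel(true);
    set_audio(true);
    on_power_button();       /* "off" is instant */
    on_power_button();       /* "on" is instant too */
    return 0;
}
```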