There are a few different things that can mean. A microchip, also known as an integrated circuit or IC, can be designed to do one specific job, like converting an analog signal into a digital one, or producing a series of voltage pulses at a certain rate, or anything else the designer needs.
Often, when designing a very complicated IC, engineers will use a “hardware description language” such as VHDL. That means they’re writing code in something very much like a programming language. Software tools can then “compile” that code into a design for a microchip. If that design is etched onto a little piece of silicon in just the right way, the result is a chip which does what the designer wanted.
So that’s one way of putting code into a chip: writing code which literally describes how the tiny etched-in components should be hooked up to each other, and how they should be laid out on the silicon.
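To give a feel for it, here is a minimal, purely illustrative sketch in VHDL (the language named above). The entity name and port names are made up for the example; it describes a two-input AND gate, i.e. two input wires, one output wire, and the logic connecting them:

```vhdl
-- Illustrative only: a two-input AND gate described in VHDL.
-- Synthesis tools can turn a description like this into actual gates on silicon.
library ieee;
use ieee.std_logic_1164.all;

entity and_gate is
  port (
    a : in  std_logic;   -- first input signal
    b : in  std_logic;   -- second input signal
    y : out std_logic    -- output signal
  );
end entity and_gate;

architecture rtl of and_gate is
begin
  y <= a and b;          -- the output is the logical AND of the two inputs
end architecture rtl;
```

A real chip design is just enormously more of this kind of thing: thousands of descriptions of signals and how they connect, which the tools translate into a physical layout.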
Then there are microcontrollers. Microcontrollers are ICs which don’t do one *specific* job; they’re designed to run software which can tell them to do all kinds of different jobs. A microcontroller usually has some rewritable long-term memory in it, which can hold a program.
Instead of designing a brand new IC from scratch, many engineers building an electronic device will elect to write code for a microcontroller. In this case, after writing the code, the designer will “flash” the microcontroller, which is slang for writing their program into that long-term memory. This can be done with a programmer device which plugs into your computer’s USB port and has a socket to plug the microcontroller into. Once the chip has been flashed, it can be removed from the programmer and installed in whatever you’re using the chip for.
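As a rough sketch of what such a program can look like, here is a tiny C example for one common hobbyist microcontroller (the AVR ATmega328P, the chip on Arduino Uno boards; the chip choice, the 16 MHz clock, and the pin used are just assumptions for the example). It blinks an LED wired to one pin, and it’s the kind of file you would compile and then flash onto the chip:

```c
/* Illustrative sketch: blink an LED on an AVR ATmega328P.
 * Assumes a 16 MHz clock and an LED wired to pin PB5. */
#define F_CPU 16000000UL        /* clock speed, needed by the delay routines */

#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= (1 << DDB5);        /* configure pin PB5 as an output */

    while (1)                   /* run forever once the chip powers up */
    {
        PORTB ^= (1 << PORTB5); /* toggle the pin: LED on, then off */
        _delay_ms(500);         /* wait half a second */
    }

    return 0;                   /* never reached */
}
```

After compiling that into a binary, a tool (avrdude is the usual one for this chip family) writes it into the chip’s flash memory through the USB programmer, and from then on the chip runs that program every time it powers up.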