Initially they were just a bit of hard plastic with sockets that allowed you to connect various discrete components together. So to build a computer you'd start with a motherboard and then plug in a CPU, some RAM, a power supply, a graphics card, a sound card, a hard drive controller card, a modem card, a printer port card, a network card and a serial port card. The motherboard was really just tracks that connected the sockets together. You bought a bunch of components, plugged them into the motherboard, and you had a computer.
Over time the motherboard manufacturers started bringing the card features on board, starting with the serial and parallel ports, then the hard drive controller, modem and sound card. What they were doing was taking all the elements of a card (transistors, chips, resistors, etc.) and putting them straight onto the motherboard, so the motherboard began to look like a batch of cards laid on their sides. Nowadays motherboards still have sockets for graphics cards, RAM and CPU, but almost everything else is laid out on the board itself. (And some motherboards have graphics, RAM and even the CPU integrated onto the board, so all you have to do is connect a power supply.)
That makes building a computer easier, with fewer things to assemble, but it also makes it harder to upgrade or repair. If a new sound card with fancy features came out, you could just pull your old sound card out and plug the new one in to upgrade your sound; if your hard drive controller failed, you could just pull it out and replace it. Nowadays those aren't options: you would have to replace the whole motherboard.