eli5 Who decides what USB type we use?


When USB-c came out the whole world just shifted to it seemingly overnight to me – how does this happen?


5 Answers

Anonymous 0 Comments

Engineers. If you’re making something that needs a USB port, you might as well pick an interface that’s robust and user friendly. USB-C is a good option currently.

Anonymous 0 Comments

The specifications are released to developers months before the first products come out with the port. If the port offers significant advantages over its predecessor then everyone will want one on their new products because they want them to be seen as leading edge.

Anonymous 0 Comments

The members of the USB Implementers Forum are the people. Back in the mid-90s, the companies developing a universal device communication protocol started the forum to set the standard. The folks who make up the forum are engineers and designers from the big tech companies you expect to see hardware from, plus folks from the companies who make the chips inside those devices.

More simply: a buncha nerds decided a couple decades ago they would be the ones to decide then made a club about it.

Anonymous 0 Comments

Manufacturers decide. Then it snowballs into others following suit and boom, it becomes a de-facto standard. This usually happens when the new port/standard is better than the old one, allowing for faster data transfer rates etc. Think of the format wars of the past, like when we switched from VHS to DVD, or LPs to CDs. Sometimes (and very often in the past), a standard ends up failing and falls into the records of tech failures. Micro-USB 3.0, for example, never became popular due to its size and relative fragility.

It’s not just USB; the same goes for many other ports and tech standards. Things like the pen mouse, DIN connectors, etc.

For example, nearly all keyboards today copy the layout of the IBM Model M keyboard. Before that, the key arrangement was all over the place and the only standard was the QWERTY arrangement of the letters.

There are also moments when manufacturers band together and agree on a standard to avoid consumer confusion and a format war. HDMI vs DisplayPort is a perfect example.

Anonymous 0 Comments

USB-C has a lot of advantages that USB-A couldn’t really provide. USB 3.0 USB-A cables have 9 wires in them, and the connector is effectively two USB ports fused together: if you look inside a blue USB-A port, you’ll see two rows of pins. One row connects to USB 2.0 devices and the other carries the USB 3.0 signals; that’s how they maintained backwards compatibility on the same port. Believe it or not, at the electrical layer the computer literally treats them as two separate ports!

This split-port design meant that a USB-A port only really had 5 pins that could be used for any standard more recent than USB 2.0 (the other 4 pins were dedicated to supporting old USB 2.0 devices and couldn’t be repurposed). This limited what the port could be developed to do. Things like displays, eGPUs, external SSDs, and other such devices require very large amounts of bandwidth, and USB-A was limiting on this front. USB-A also wasn’t designed to carry the large amounts of power required to fast-charge phones, laptops, or other devices.

USB-C came along to solve many of these concerns. It has 24 pins, which gives it MUCH more flexibility and allowed Thunderbolt, DisplayPort, USB 3.2, and other more recent standards to work over it. It also has the added bonus of being much more durable and less of a hassle (cables can be plugged in either direction), and it unified different kinds of ports into one.
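The pin arithmetic above can be sketched as a quick comparison. This is just an illustration (the dictionary and helper names are made up for this example; the pin counts come from the connector descriptions above):

```python
# Rough pin budget per connector generation, as described above:
# USB-A (USB 3.x) keeps 4 legacy USB 2.0 pins and adds 5 new ones;
# USB-C has 24 pins, leaving far more room for newer protocols
# (Thunderbolt, DisplayPort alt mode, USB 3.2, power delivery).
connectors = {
    "USB-A (USB 2.0)": {"total": 4,  "legacy": 4},
    "USB-A (USB 3.x)": {"total": 9,  "legacy": 4},
    "USB-C":           {"total": 24, "legacy": 4},  # 4 pins kept for USB 2.0 fallback
}

for name, pins in connectors.items():
    free = pins["total"] - pins["legacy"]
    print(f"{name}: {pins['total']} pins, {free} free for newer standards")
```

Running this makes the gap obvious: USB 3.x over USB-A only ever had 5 pins to play with, while USB-C has roughly four times that.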

In short, there were a lot of different reasons to make the switch. USB-C was much more future-proof, and it could do things that USB-A couldn’t.