Why is Bluetooth so much flakier than USB, WiFi, etc?

For ~20 years now, basic USB and WiFi connections have been in the category of “mostly expected to work” – you do encounter incompatibilities, but they tend to be unusual.

Bluetooth, on the other hand, seems to have been “expected to fail or at least be flaky as hell” since Day 1, and it doesn’t seem to have gotten better over time. What makes the Bluetooth stack/protocol so much more apparently-unstable than other protocols?

18 Answers

Anonymous 0 Comments

I used to work in a radio test house and I’ve tested hundreds of Bluetooth and Wi-Fi products. Overall, Bluetooth (now) is a great protocol that is forced into shit situations.

The 2.4 GHz spectrum is quite crowded. Wi-Fi, Bluetooth, ZigBee, baby monitors and many other protocols use it (so do some microwaves). The reason is that it’s a worldwide unlicensed band, meaning it’s legal to use 2.4 GHz in North America, Europe, China, or anywhere else. Compare this to cellular, which has to automatically switch bands when you go to different countries; that’s a lot more complicated.

Within the 2.4 GHz band, there aren’t really any rules about what’s needed to coexist with other products (unless you’re a medical product in the US). There are regulations, but they don’t say what happens if Bluetooth and 2.4 GHz Wi-Fi want to transmit at the same frequency at the same time. The opinion of governing bodies is that if you can’t handle coexisting in unlicensed bands, then use licensed spectrum. Overall, this means each protocol has its own way of dealing with getting data through.

For Wi-Fi, it buffers. Wi-Fi can transmit between 1 Mbps and 65 Mbps (speeds can go much higher, but this is what’s doable on one channel and one antenna at 2.4 GHz using 802.11n). Meanwhile, Bluetooth only operates between 125 kbps and 3 Mbps, and if you’re using something with a small battery like earbuds, then 2 Mbps is likely the max. Overall, this means that Wi-Fi is much more likely to have more data buffered than Bluetooth, and therefore you’re less likely to have interruptions. Wi-Fi also uses much more power on average, and if you’re using more power, you’re more likely to get those higher data rates.
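
To put rough numbers on the buffering point, here’s a back-of-the-envelope sketch. The stream bitrate, buffer depth, and link rates are my own assumed examples, not figures from the answer above; the takeaway is just that a faster link spends a tiny fraction of its airtime keeping the stream fed, so it can get far ahead and ride out interruptions.

```python
# Assumed example numbers: a 256 kbps audio stream and a 1-second playback buffer.
STREAM_KBPS = 256
BUFFER_SECONDS = 1.0

def airtime_to_fill_buffer(link_kbps: float) -> float:
    """Seconds of airtime needed to push BUFFER_SECONDS worth of audio over the link."""
    bits_needed = STREAM_KBPS * 1000 * BUFFER_SECONDS
    return bits_needed / (link_kbps * 1000)

for name, link_kbps in [("Wi-Fi, one 2.4 GHz 802.11n stream", 65_000),
                        ("Bluetooth EDR", 2_000)]:
    duty_pct = STREAM_KBPS / link_kbps * 100
    fill_ms = airtime_to_fill_buffer(link_kbps) * 1000
    print(f"{name}: ~{duty_pct:.1f}% of airtime just to keep up; "
          f"rebuilding a {BUFFER_SECONDS:.0f} s buffer takes ~{fill_ms:.0f} ms")
```

At these assumed rates, Wi-Fi needs well under 1% of its airtime for the stream and can refill a second of buffer in a few milliseconds, while Bluetooth needs over a tenth of its airtime and roughly 130 ms, so it has far less slack when the channel gets noisy.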

How Bluetooth deals with the crowded spectrum is frequency hopping. Basically, Bluetooth scans for open channels, identifies them, and hops between them. However, if something suddenly appears on that channel, like Wi-Fi, it can ruin that transmission. It’s also possible there just aren’t any open channels and Bluetooth has to do the best it can.
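
Here’s a toy sketch of the idea, not the actual hop-selection algorithm from the spec: keep a map of which of the 79 classic Bluetooth channels still look clear and hop pseudo-randomly among them, masking out the ones a neighbouring Wi-Fi network is sitting on. The Wi-Fi overlap range below is an approximation I’ve assumed for illustration.

```python
import random

# Classic Bluetooth has 79 channels of 1 MHz each, at 2402 + k MHz for k = 0..78.
ALL_CHANNELS = list(range(79))
channel_good = {ch: True for ch in ALL_CHANNELS}

def mark_busy(channels):
    """Pretend the channel-assessment logic flagged these channels as occupied."""
    for ch in channels:
        channel_good[ch] = False

def next_hop():
    """Hop pseudo-randomly among the channels still believed to be clear."""
    usable = [ch for ch in ALL_CHANNELS if channel_good[ch]]
    return random.choice(usable)

# A 20 MHz Wi-Fi network on Wi-Fi channel 6 (centred at 2437 MHz) roughly
# covers Bluetooth channels 25..45, so mask those out.
mark_busy(range(25, 46))

print("next hops:", [next_hop() for _ in range(10)])  # none land on the masked channels
```

If several Wi-Fi networks are parked across the band, the usable list shrinks until there’s not much left to hop to, which is exactly the “no open channels, do the best you can” case.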

Bluetooth is also very cheap. You can buy a pre-approved module, slap it in a product, and boom, now you have a “smart” product. Or at least, that’s how it’s sold by module manufacturers. In reality, you can do 100 things wrong when installing your module. You could put it in a metal box, which kills transmission power. You can slam it between other circuit boards, which causes wicked harmonics. You can put it right next to an LCD screen, which causes radio desensitization. These problems exist for a Wi-Fi module too. However, since Wi-Fi costs a little bit more, someone who uses Wi-Fi probably isn’t cheaping out everywhere, and it’s more likely they’ve hired qualified engineers who know how to alleviate the issues I discussed previously.

The other area where Bluetooth suffers is that, even in very well designed products like AirPods, it’s being forced into crappy situations. The body attenuates 2.4 GHz quite seriously, and AirPods are very small and closer to the body compared to the wireless headphones of 20 years ago. Also, they’re not communicating with a laptop that’s in front of you, they’re communicating with a phone that’s on the other side of your body in your pocket. Often you’ll find that if you’re in a normal sized room you’ll be fine, but when you’re outside you drop the connection. This is because indoors, the signal can bounce off the walls to reach the earbuds.
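
Here’s a back-of-the-envelope link budget that shows why the “phone in the opposite pocket, outdoors” case is the hard one. The transmit power, body loss, reflection loss and receiver sensitivity are all ballpark figures I’ve assumed for illustration, not measured values.

```python
import math

def fspl_db(distance_m: float, freq_hz: float = 2.44e9) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

TX_POWER_DBM = 4         # assumed earbud-class transmit power
BODY_LOSS_DB = 45        # assumed extra loss when the torso blocks the direct path
REFLECTION_LOSS_DB = 10  # assumed loss for a bounce off a nearby wall
SENSITIVITY_DBM = -90    # rough receiver sensitivity

paths = {
    "phone in front of you (0.5 m, clear)": TX_POWER_DBM - fspl_db(0.5),
    "phone in opposite pocket (through body)": TX_POWER_DBM - fspl_db(0.5) - BODY_LOSS_DB,
    "indoors, bounce off a wall ~4 m away": TX_POWER_DBM - fspl_db(4.0) - REFLECTION_LOSS_DB,
}

for name, rx_dbm in paths.items():
    print(f"{name}: ~{rx_dbm:.0f} dBm received, {rx_dbm - SENSITIVITY_DBM:+.0f} dB above sensitivity")
```

Indoors you effectively get the better of the last two paths, so there’s plenty of margin; outdoors the wall-bounce path disappears, and the remaining margin on the through-body path has to cover fading, antenna losses and interference, which is where the dropouts start.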

Bluetooth has adapted in the following ways over its generations: they’ve upped the data rate, they’ve released Bluetooth Low Energy (which is basically a whole new protocol designed to save battery life), they’ve introduced a long range mode which goes down to 125 kbps so you have a better chance of getting something through, and they’ve worked with cell phone manufacturers to get Bluetooth testing to be more self-regulated.

Anonymous 0 Comments

I’ve been designing Bluetooth products for 10 years and love answering this question. As a protocol it’s fine. Could maybe be more secure, but overall it’s a great protocol. The problem is that a lot of companies/engineers think Bluetooth is easy and add it to a lot of their products, but a lot of care is needed to make Bluetooth rock-solid. I recall that even on the iPhone 6, they didn’t use the right load capacitors for a crystal, basically making the timing flaky on that phone. Bluetooth is a timing-based protocol, so bad timing means bad connections.
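
For a sense of scale of why clock accuracy matters (the ppm figures below are generic examples I picked, not the iPhone 6’s actual error): a frequency reference that’s off by a few hundred parts per million pulls the 2.4 GHz carrier off by hundreds of kHz and slides the packet timing by hundreds of microseconds between exchanges, and the other radio only budgets its receive windows for clocks that stay within the spec’d tolerances.

```python
# Assumed example figures: a 2.44 GHz carrier and a 1.28 s interval between exchanges.
CARRIER_HZ = 2.44e9
INTERVAL_S = 1.28

for ppm in (10, 50, 250):
    carrier_offset_khz = CARRIER_HZ * ppm * 1e-6 / 1e3   # how far off-frequency the radio ends up
    timing_drift_us = INTERVAL_S * ppm * 1e-6 * 1e6      # how far the packet timing slips per interval
    print(f"{ppm:4d} ppm -> carrier off by ~{carrier_offset_khz:6.1f} kHz, "
          f"timing slips ~{timing_drift_us:5.1f} us per {INTERVAL_S} s")
```

A crystal pulled off-frequency by the wrong load capacitance pushes those numbers up, so the two radios slowly stop looking for each other at quite the same time and place in the spectrum.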

TLDR: The Bluetooth protocol is fine; it’s just that most assume it’s easy and don’t take the time to do it well. These companies pollute the reputation of a great protocol.

Anonymous 0 Comments

Do people have problems with Bluetooth a lot? As long as I’m within a reasonable distance and there isn’t a thick material obstructing line of sight, I’ve never really had an issue.

Anonymous 0 Comments

To piggyback on the OP’s question: when there is a connection issue with Bluetooth, is the reason often a toss-up? 🤷‍♂️ Meaning, is it hard to troubleshoot one-to-one considering how many devices can use Bluetooth, plus all the other stuff that may affect the signal?

Anonymous 0 Comments

Many reasons: GFSK, low power, overly complicated stack. But honestly, the software is worse than the hardware. And it’s all black boxes, so you can’t blame the developers; they are flying blind. The litigious Bluetooth SIG deserves most of the blame. If there were an open source alternative, it would probably be 200% better.

Anonymous 0 Comments

Radio interference is the worst. It’s like everyone simultaneously trying to speak over a walkie-talkie. Radio interference is why Bluetooth has issues. WiFi reaches farther at 2.4 GHz but has more channels to use at 5 GHz. Too many WiFi devices in the area cause frequent disconnects. Maybe you don’t notice them, or you have a good WiFi adapter that adds buffering to hide the disruptions.

The 2.4 GHz spectrum is open to many consumer devices that the FCC doesn’t individually license. Hence local carriers and cable companies use 2.4 GHz for last-mile commercial purposes that steal what little bandwidth is remaining. Sometimes your ISP uses your WiFi Internet modem to serve other people. (I’m sure they could be forced in court to stop doing this, as it uses more of your electricity.) Some 2.4 GHz WiFi APs have started implementing a Bluetooth coexistence mode to detect low-bandwidth devices and let them speak, but not everyone is this polite. Your 2.4 GHz cordless landline phone or your 2.4 GHz wireless mouse could interfere with your Bluetooth. Even a cheap USB 3 data cable can interfere with devices near it in the 2.4 GHz spectrum.

The solution is to demand the FCC open up more wireless spectrum for end consumers, demand that commercial entities can’t use consumer spectrum, and demand lower power. Lastly, demand that the FCC monitor radio pollution.

Anonymous 0 Comments

The main reason Bluetooth is flakier than other protocols is its design. Bluetooth was designed to use very low power and to be small in size. This means the antennas used in Bluetooth devices are smaller than the ones used for other protocols, making them less efficient and more prone to signal loss.

Additionally, Bluetooth is mostly used for real-time audio, which can only buffer a small amount before the latency becomes noticeable. As a result, if the signal is lost for longer than that small buffer, the audio drops. Other protocols like USB and WiFi can buffer more data and tolerate some retransmission, making them less prone to visible signal issues.

Anonymous 0 Comments

Having worked on devices that implement the Bluetooth spec, here’s the real answer: Bluetooth would work flawlessly if everything just implemented the spec.

The problem is: Nothing implements the spec correctly.

If you make wireless earbuds, for instance, and make sure they follow the spec to the letter, everyone will complain about your device because it won’t connect to anything reliably.

So you have to figure out the common part of the spec implemented by most devices, and the common ways they violate the spec, and you implement that. And there’s always variance because no two devices completely agree on what they’re doing, so there are a lot of errors in every negotiation, and they differ from device to device.
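
In practice that often ends up looking something like a quirks table: start from spec-compliant defaults, then override parameters for specific peers you know misbehave. This is a hypothetical illustration (the vendor/product IDs and tweaks below are made up); the shape of the workaround is the point.

```python
# Hypothetical per-device quirks, keyed by (vendor_id, product_id) learned during discovery.
QUIRKS = {
    (0x1234, 0x0001): {"skip_role_switch": True},       # drops the link if asked to switch roles
    (0x5678, 0x00a2): {"max_mtu": 512},                  # advertises a huge MTU it can't actually handle
    (0xabcd, 0x0010): {"delay_after_connect_ms": 300},   # needs a pause before the first request
}

def apply_quirks(vendor_id: int, product_id: int, defaults: dict) -> dict:
    """Start from spec-compliant defaults, then layer on workarounds for this peer."""
    params = dict(defaults)
    params.update(QUIRKS.get((vendor_id, product_id), {}))
    return params

defaults = {"max_mtu": 1021, "skip_role_switch": False, "delay_after_connect_ms": 0}
print(apply_quirks(0x5678, 0x00a2, defaults))   # MTU capped for this particular peer
```

The net effect is that the protocol you actually interoperate with is the spec plus everyone else’s deviations from it.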