For ~20 years now, basic USB and Wi-Fi connections have been in the category of “mostly expected to work” – you do encounter incompatibilities, but they tend to be unusual.
Bluetooth, on the other hand, seems to have been “expected to fail, or at least be flaky as hell” since Day 1, and it doesn’t seem to have gotten better over time. What makes the Bluetooth stack/protocol so much more apparently unstable than other protocols?
Bluetooth uses the same 2.4 GHz frequencies as Wi-Fi, so if you’re using both, the signals clash and the Bluetooth connection gets flaky. Wi-Fi is a stronger signal, so it wins out over Bluetooth and doesn’t degrade as badly. The effects of stuttering Bluetooth are also a lot more noticeable than stuttering Wi-Fi.
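To put rough numbers on that clash, here’s an illustrative sketch of the channel overlap using the standard channel plans (the 22 MHz Wi-Fi channel width is approximate, and this ignores the adaptive frequency hopping newer Bluetooth uses to dodge busy channels):

```python
# How many classic Bluetooth channels sit inside one 2.4 GHz Wi-Fi channel?
WIFI_WIDTH_MHZ = 22  # a 2.4 GHz Wi-Fi channel occupies roughly 22 MHz

def wifi_center_mhz(channel: int) -> int:
    """Center frequency of 2.4 GHz Wi-Fi channels 1-13."""
    return 2407 + 5 * channel

# Classic Bluetooth hops across 79 channels, 1 MHz apart, 2402-2480 MHz.
bt_channels = [2402 + k for k in range(79)]

for wifi_ch in (1, 6, 11):
    lo = wifi_center_mhz(wifi_ch) - WIFI_WIDTH_MHZ / 2
    hi = wifi_center_mhz(wifi_ch) + WIFI_WIDTH_MHZ / 2
    blocked = sum(lo <= f <= hi for f in bt_channels)
    print(f"Wi-Fi channel {wifi_ch} covers {blocked} of Bluetooth's 79 channels")
```

Three busy Wi-Fi networks on channels 1, 6, and 11 can blanket most of the band, which is why Bluetooth hops ~1600 times a second and, since Bluetooth 1.2, adaptively skips channels it finds noisy.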
USB is wired, so data is pretty much guaranteed to be sent and received without interference. Wi-Fi and Bluetooth are wireless, so they can be messed with far more easily. Think of someone yelling to you: whether you can hear them depends on other factors. Are other people talking? Are they yelling from far away? Are they whispering rather than yelling? USB is basically like a call coming over a wired phone; it’s going to work with fewer issues.
Wi-Fi versus Bluetooth is more like someone who yells louder and has better ears versus someone who whispers and has okay-ish hearing. Bluetooth is meant for short-distance, low-energy transmission for the most part, so it’s just harder to hear what someone said when they’re whispering.
Many others have said it, but Bluetooth is not just one thing; there are versions from 1 to 5.
Just as Wi-Fi has 2.4 GHz and 5 GHz, Bluetooth can similarly be split into pre-Bluetooth-4 and 4+.
Bluetooth 4 and above can transmit much more data AND consumes less power AND handshakes better.
Older and cheaper devices often do not use Bluetooth 4, and it requires both the sender and the receiver to have the right hardware.
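If you’re curious which version your own adapter speaks, it reports a raw HCI version number that maps to a spec version; the mapping below comes from the Bluetooth Assigned Numbers list (tools like `hciconfig -a` or `btmon` on Linux show the raw byte):

```python
# HCI version byte reported by a Bluetooth adapter -> core spec version
# (per the Bluetooth Assigned Numbers list).
HCI_VERSION_TO_SPEC = {
    0: "1.0b", 1: "1.1", 2: "1.2", 3: "2.0+EDR", 4: "2.1+EDR",
    5: "3.0+HS", 6: "4.0", 7: "4.1", 8: "4.2",
    9: "5.0", 10: "5.1", 11: "5.2", 12: "5.3", 13: "5.4",
}

def spec_version(hci_version: int) -> str:
    """Translate the raw HCI version byte into a human-readable spec version."""
    return HCI_VERSION_TO_SPEC.get(hci_version, f"unknown ({hci_version})")

print(spec_version(6))  # "4.0" -- the cutoff the answer above is drawing
```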
Also, Bluetooth is low power on 2.4 GHz, the microwave spectrum. No joke, it’s the same frequency microwave ovens work at. This is why, when you run a microwave, a Bluetooth device drops out if the oven is nearby or between you and the device (this also affects 2.4 GHz Wi-Fi).
Why is this? Well, the FCC allows any device to use 2.4 GHz without much regulation, so a lot of devices radiate at 2.4 GHz: microwave ovens, cordless phones, Wi-Fi, Bluetooth, and most “wireless” keyboards that are not Bluetooth.
Wi-Fi solved this by adding 5 GHz, another band that can be used but is nowhere near as crowded.
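The 5 GHz escape hatch comes with a tradeoff, which a back-of-the-envelope free-space path loss calculation shows (real rooms with walls are worse, but the relative gap holds):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458  # speed of light in m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

for label, f in (("2.4 GHz", 2.44e9), ("5 GHz", 5.5e9)):
    print(f"{label}: {fspl_db(10, f):.1f} dB lost over 10 m")

# 5 GHz loses roughly 7 dB more at the same distance -- about a fifth of
# the received power -- so it trades range for a quieter band.
```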
Bluetooth is a set of many incredibly complex protocols, often implemented with poor testing, resulting in many bugs.
Then, if a bug affects a popular device, some other manufacturers intentionally build their devices to be compatible with the bug… resulting in them being incompatible with bug-free devices.
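That bug-compatibility dance is real; the Linux Bluetooth stack, for instance, maintains hardcoded per-device quirk tables. Here’s a minimal sketch of the idea (the device IDs and quirk names are invented for illustration):

```python
# Sketch of how a Bluetooth stack works around known-buggy peer devices.
# The (vendor, product) IDs and quirks below are made up for the example.
QUIRKS = {
    (0x1234, 0x0001): {"delay_after_connect_ms": 200},  # chip that drops early packets
    (0x5678, 0x00A2): {"skip_feature_exchange": True},  # firmware that hangs on the request
}

def workarounds_for(vendor_id: int, product_id: int) -> dict:
    """Return any special handling needed for a known-buggy device."""
    return QUIRKS.get((vendor_id, product_id), {})

print(workarounds_for(0x1234, 0x0001))  # {'delay_after_connect_ms': 200}
```

Every quirk like this is another way two “compliant” devices can behave differently, which is part of why pairing feels like a lottery.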
In addition to that, many Bluetooth devices use far less transmit power than Wi-Fi gear, so their signal is easier to drown out.
One of the reasons Bluetooth has gotten better over time relates to improvements in microprocessor performance.
Modern mobile devices have more processing power, which allows for more sophisticated data compression algorithms. These newer algorithms let Bluetooth data travel longer distances using less power without dropping packets, while also freeing up the device’s processor for noise reduction and other work. The Bluetooth spec was actually just updated last year with some significant improvements, so you’ll soon see much less flakiness in Bluetooth performance (especially for audio), provided you’re using a modern device that supports Bluetooth 5.3 and above.
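For a rough sense of what better compression buys, compare the audio bitrates involved; SBC is the old mandatory codec, LC3 is the one introduced with LE Audio, and the quality-equivalence figure is the Bluetooth SIG’s own claim:

```python
# Back-of-the-envelope bitrates for stereo audio over Bluetooth (kbit/s).
raw = 48_000 * 16 * 2 / 1000  # uncompressed 48 kHz / 16-bit stereo = 1536 kbit/s
sbc = 345                     # classic SBC at a typical high-quality setting
lc3 = 160                     # LC3 rate the Bluetooth SIG rates as comparable quality

print(f"raw: {raw:.0f} kbit/s, SBC: {sbc} kbit/s, LC3: {lc3} kbit/s")
print(f"LC3 needs ~{lc3 / sbc:.0%} of SBC's airtime for similar quality")

# Less airtime per second of audio leaves more slack for retransmissions
# and frequency hops, which is what makes the link feel less flaky.
```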