To people with Bluetooth expertise: Why is the Bluetooth audio experience always so...bad? Are there some fundamental technical limitations in the spec? Are my expectations unreasonable? Or is it just always implemented poorly? Followup: are there competing technologies that Just Work™?
[0]: https://xkcd.com/2055/
[1]: https://support.apple.com/en-is/HT209369 ("move away from places [with] a lot of Wi-Fi activity")
[2]: http://www.iphonehacks.com/2018/04/heres-why-your-airpods-or-other-bluetooth-headphones-cut-out-while-crossing-a-street.html
Guess what people are mainly made of?
Water. And water absorbs 2.4 GHz RF, which is exactly where Bluetooth operates. So if you are standing between a BT transmitter and a BT receiver, a lot of magic needs to happen to get the data moving between the two.
People love AirPods. People often put their phones in their pockets. A lot of water in the way.
If you notice BT being particularly terrible, look for how water might be blocking the signal.
This, of course, isn't the cause of all BT problems, but knowing this fundamental fact really helps with troubleshooting when you run into them.
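For intuition, here's a rough back-of-the-envelope link budget in Python. Every number in it (Class 2 transmit power, receiver sensitivity, ~20 dB of loss through a torso) is an illustrative assumption, not a measured figure, but it shows how quickly a body in the path eats into the margin:

    # Rough link-budget sketch for a phone-to-earbud Bluetooth link.
    # All numbers are illustrative assumptions, not measured values.
    import math

    def free_space_path_loss_db(distance_m, freq_hz=2.4e9):
        """Free-space path loss (FSPL) in dB."""
        c = 3e8  # speed of light, m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    tx_power_dbm = 0.0          # typical Class 2 Bluetooth transmit power (assumption)
    rx_sensitivity_dbm = -85.0  # ballpark receiver sensitivity (assumption)
    body_loss_db = 20.0         # rough loss through a torso full of water (assumption)

    for distance_m, through_body in [(1.0, False), (1.0, True), (2.0, True)]:
        loss = free_space_path_loss_db(distance_m) + (body_loss_db if through_body else 0.0)
        margin = tx_power_dbm - loss - rx_sensitivity_dbm
        print(f"{distance_m} m, body in path={through_body}: link margin ~ {margin:.1f} dB")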
It's the same issue as with practically everything else digital these days. Hugely complicated thousand-page standards that few, if any, understand in their entirety. Implementations that are rushed to market to "be fixed later because it's just software" - which never actually happens because the next thing comes along and instead of fixing existing shit it's again a rush to the deadline to push more of it out the door.
I see what you did there.
It's unfixable.
My honest professional opinion is the people that wrote the spec had no idea what they were doing.
It's easy to see the mistakes they made. If you look at the early-to-mid 1990s, there were cheap FM radio ICs that drew a couple of mA. At the same time, ISM-band microwave ICs were expensive, made of GaAs, and just an LNA would cost you 50-100 mA.
Wi-Fi did exist back then, but the cost was around $150-200 for a chipset and the power requirements precluded battery operation.
So 'of course' they went with a low-power FM radio.
The problem was that as you shrink CMOS transistor sizes, they get faster. By the late 1990s you could build a direct-sequence transceiver IC out of CMOS. The one my group rolled out in 1999 drew about 200 mA in receive mode and cost $4/1M.
Zigbee came two or three years later and the power requirements were down to 25-50 mA.
Bluetooth's designers completely failed to judge the progress being made in CMOS RF IC design, progress that was obvious to anyone in the industry.
The second problem had to do with the baseband. The Bluetooth baseband has several layers and essentially replicates the USB protocol/driver stack. Notice how easy it was for Microsoft to roll out the USB stack? Yeah, total shitshow.
So you had a dumb simple frequency hopping radio married to a complicated baseband.
The rub: of the roughly two dozen companies that tried to build Bluetooth radios circa 2000, all but three failed, because they couldn't get the baseband power consumption low enough.
Usually when you design a spec like Bluetooth you get your potential manufacturers on board working on silicon as you design the spec. They didn't.
Notably, the Zigbee spec had the same problem with its baseband, and most people solved that problem by ignoring it and just using the radio.
A third problem (which affected Zigbee as well) is that the transmit power was too low. It was made worse with Bluetooth because the FM radio's sensitivity sucks. Back then I knew from cordless-phone work that you needed at least +5 dBm of transmit power to get reliable QoS. Bluetooth and first-gen Zigbee radios had less than 0 dBm.
All of the above a small brained primate like myself could have told you.
It's worse than you think, because Bluetooth is a frequency hopper, which means streaming QoS is limited by the link margin on your worst channel.
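To make the worst-channel point concrete, here's a toy Python sketch. The per-channel loss numbers are made up purely for illustration; the point is that a hop sequence visits every channel, so a few badly interfered channels keep contributing their losses, whereas a fixed-channel radio could just park on a clean one (adaptive frequency hopping helps, but only after the bad channels have been detected):

    # Toy model: streaming over a frequency hopper vs. sitting on one clean channel.
    # Per-channel packet loss figures are illustrative assumptions.
    import random

    random.seed(0)
    num_channels = 79                      # classic Bluetooth hops across 79 x 1 MHz channels
    per_channel_loss = [0.01] * num_channels
    for bad in random.sample(range(num_channels), 8):  # pretend 8 channels sit under Wi-Fi
        per_channel_loss[bad] = 0.60

    # Hopping uniformly over all channels: the bad ones are visited on every pass.
    hopping_loss = sum(per_channel_loss) / num_channels
    # A fixed-channel radio parked on the cleanest channel only sees that channel's loss.
    best_channel_loss = min(per_channel_loss)

    print(f"average packet loss while hopping : {hopping_loss:.1%}")
    print(f"packet loss on the cleanest channel: {best_channel_loss:.1%}")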
Newer Bluetooth variants support higher quality audio without a microphone (A2DP), and that's what you are likely using today unless you are in a call.
As far as interference in the 2.4GHz range, there is not much you can do about it in areas you don't control.
> If I remember correctly, the bluetooth headset profile (if you require microphone input) is pretty low bandwidth (think gsm quality), which would explain low quality in that case.
The headset profile is a bit nicer than GSM (higher rates, a more advanced codec), though nothing you'd want to listen to music through.
> Newer Bluetooth variants support higher quality audio without a microphone (A2DP), and that's what you are likely using today unless you are in a call.
A2DP has been around for a while. As some other comments mention, there are multiple codecs that can be used. The minimum one is SBC (also used by HSP, at lower rates), which... isn't great.
Once you're using aptX/AAC, perceived quality is far more likely to be dominated by the electrical and acoustic aspects of your headphones.
My old Moto G could play FM Radio, and you could select Bluetooth as the output. Fast forward to today, and the Samsung A50 (in my hands right now) has the ability to pick the app and select Bluetooth as the output, but it doesn't work for FM Radio. Why? Because Bluetooth is just not consistent, in my opinion.
EDIT: I am probably in the minority of people that insist on FM Radio on their phone. Just putting that out there as most people will not run into my problems.
Right now the A50 has a really nice-sounding tuner with either my Samsung earbuds or my Sony headphones. NextRadio appears to be winding down its features, so you have to use the basic tuner, manually seek to each station, and add it to your favourites.
All this to say, there's not a lot of attention to FM Radio.
https://www.makeuseof.com/tag/how-to-unlock-fm-radio-hidden-...
https://www.soundguys.com/ldac-ultimate-bluetooth-guide-2002...
BT headphones can and do sound nice.
It feels weird to power them off and take them off in a room with modest background noise, and find, for a few seconds, that the noise seems quite loud.
The only nitpick I have is that, after reading the legalese, I decided there was no way in hell I was going to use the Sony Connect app, but the headphones sound so good to begin with that it's unnecessary to install it anyway.
In terms of alternatives, I have heard Zigbee mentioned. Also, aux cords Just Work. And I know this wouldn't work for everything, but I always wish I could have infrared transmission wherever possible. Even if I can't actually hear the transmissions, it still feels very loud to be blasting Bluetooth out in every direction.
https://zigbee.org/
My friend who's an electrical engineer is quite fond of it. I really don't know more than his opinion.
SBC was made for voice and is acceptable for that. Better codecs like aptX and AAC exist in many devices now, but because of incompatibilities SBC is often negotiated anyway.
SBC is not a bad codec. It is a low complexity codec (sort of similar to MP2) optimized for low power consumption and quick encoding/decoding, and I think it's been unfairly maligned, due to devices that are way too stingy with bit rates.
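To put numbers on "stingy with bit rates": the SBC bit rate is a direct function of the negotiated bitpool. Here's a small Python sketch using the joint-stereo frame-length formula from the A2DP specification (as I understand it); the bitpool values are just examples, with 53 being the spec's "high quality" recommendation:

    # SBC bit rate as a function of the negotiated bitpool (joint stereo, 44.1 kHz).
    # Frame-length formula per the A2DP spec; bitpool values are example figures.
    import math

    def sbc_bitrate_kbps(bitpool, sample_rate=44100, blocks=16, subbands=8, channels=2):
        # Joint-stereo frame length in bytes.
        frame_len = (4 + (4 * subbands * channels) // 8
                     + math.ceil((subbands + blocks * bitpool) / 8))
        return 8 * frame_len * sample_rate / subbands / blocks / 1000

    for bitpool in (20, 35, 53):
        print(f"bitpool {bitpool}: ~{sbc_bitrate_kbps(bitpool):.0f} kbit/s")
    # Roughly 146, 229 and 328 kbit/s respectively: same codec, very different quality.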
BT music from my iPhone to my car stereo sounds bad. There I can't verify what codec or bitrate is used, but there is clearly a difference compared to a music file played from an SD card reader on the dashboard.
I also have a pair of Sony Bluetooth headphones that I use with my MacBook; the sound is awful until I use a BT utility to choose aptX instead (which they also support), but SBC is selected by default anyway.
Also what is it about specific intersections that reliably causes connectivity / stutter issues? Is it something specific to the traffic signalling equipment or spectrum pollution? It happens more in dense urban environments but there are several extremely bland pedestrian crossings in my neighborhood where this happens as well. And then only on certain headphones.
Also the fact that battery reporting is still not standard is annoying.
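There is a standard GATT Battery Service for BLE (0x180F, with the Battery Level characteristic 0x2A19), but classic audio headsets often report battery through HFP AT commands or vendor-specific extensions instead, which is why support ends up so patchy. A minimal sketch (using the bleak library, with a placeholder address) for devices that do expose the BLE service:

    # Read the standard GATT Battery Level characteristic (0x2A19) over BLE with bleak.
    # Only works for devices that implement the Battery Service; many classic Bluetooth
    # headsets report battery via HFP AT commands or proprietary extensions instead.
    import asyncio
    from bleak import BleakClient

    BATTERY_LEVEL_CHAR = "00002a19-0000-1000-8000-00805f9b34fb"
    DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"  # placeholder; substitute your device's address

    async def main():
        async with BleakClient(DEVICE_ADDRESS) as client:
            raw = await client.read_gatt_char(BATTERY_LEVEL_CHAR)
            print(f"Battery level: {raw[0]}%")  # single byte, 0-100

    asyncio.run(main())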
I notice that the connection is not reliable, even when the phone is in a holder on the dash. I end up using the AUX input through the headphone jack.
I'm wondering if this is a common problem in cars, or if I should have my car looked at.
Fixing it requires going through the whole pairing and unpairing procedure. So same here, I just end up plugging it into the USB AUX.
Toyota seems to be less bad than most. I'm sure some others are also less-than-horrific. Maybe Tesla?
The W1/H1 chips, hereafter referred to as "Makes BT Not Suck" chips or MBNS, are the reason I put up with the otherwise unremarkable sound quality of Beats Studios and AirPods. I was reminded why just recently as I took the Bluetooth speaker out to the garage so the wife and I could do yoga. "Oh, yeaaaahhh. I have to go find the last device it was connected to, and disconnect it. Then, and only then, can I pair to her iPad."
If I had a HomePod or other AirPlay speaker in the garage, no matter what the speaker was previously connected to, I could have just told the iPad to use the speaker and be done with it.