
I also don't get why this hasn't been fixed yet. There is clearly a need for high-quality Bluetooth headsets.

Two sinks to one source works well on some devices, but that's more of an edge case for most people.

I also have the feeling that Apple AirPods sound better when connected to an iPhone than any other Bluetooth headset does. Do they use a proprietary Hands-Free Profile?



My understanding is that the AirPods and iPhone have a high-quality codec that isn't widely supported yet ...


My understanding is that music gets decompressed on the Apple headphones: the original AAC stream is sent over the air untouched. Most Bluetooth headphones will instead effectively transcode the audio, such that, for example, AAC gets decompressed and then recompressed into aptX/LDAC. For music listening, that's completely suboptimal. I use FLAC sources for my Sony headphones, which makes them very listenable (along with copious EQ to tame the +5 dB excess bass). Bluetooth headphones have a lot of room for improvement.
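
For what it's worth, you can simulate that double-lossy chain yourself. A minimal sketch, assuming an ffmpeg binary with aptX support is on your PATH (ffmpeg ships an aptX encoder); "input.m4a" and the output name are just placeholders:

    # Simulate the transcode chain described above: a lossy AAC source
    # is decoded to PCM, then re-encoded with the Bluetooth codec
    # (aptX here) -- two lossy generations instead of one.
    import subprocess

    def transcode_like_bluetooth(aac_in: str, aptx_out: str) -> None:
        subprocess.run(
            ["ffmpeg", "-y",
             "-i", aac_in,                 # first lossy generation (AAC)
             "-ar", "44100", "-ac", "2",   # aptX expects stereo PCM
             "-c:a", "aptx",               # second lossy generation
             aptx_out],
            check=True)

    transcode_like_bluetooth("input.m4a", "over_the_air.aptx")

Decode the .aptx file back to WAV and compare it against a direct decode of the AAC to judge the damage for yourself.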


> My understanding is that music gets decompressed on the Apple headphones: the original AAC stream is sent over the air untouched.

That's claimed in many places, but this post claims the opposite: https://habr.com/en/post/456182/. It says everything gets (re-)compressed into AAC on the sending device before being sent to the AirPods, in order to mux in other audio events. In principle, with end-to-end Apple hardware, they could send multiple streams and leave the muxing to the AirPods, but I don't think anyone has conclusively shown whether this is actually done.


If there's one thing I've learned about technical claims about Apple hardware, it's that they are not to be believed unless they come straight from Apple in a precise and unambiguous statement, or from a reverse engineer/hacker who has looked at the code/protocols.

The reality distortion field is just as strong as always when it comes to technical details too. People will make random things up to prove that Apple is different or (more) special.


Claims of poor Bluetooth audio quality generally fall into the following categories:

* Bad source settings/implementation (e.g. bitrate too low)

* Bad sink implementation/EQ/DAC

* The HSP issue referenced by OP (where you can't have both HQ audio and use the mic)

* FUD by patented codec authors

The reality is that even the basic royalty free Bluetooth SBC codec is perfectly fine, and sounds transparent at high bitrate settings, which decent sources should be using and all sinks are required to support. Transcoding doesn't make much of a difference either. It's a poor codec, but the bitrate is high enough that it doesn't matter. You can ABX test it yourself if you're so inclined, purely in software, with high quality wired audio hardware. I have. You'll see the codec isn't the problem.
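
If you want to run that ABX test yourself, here's roughly how I'd set it up in software. This assumes ffmpeg built with SBC support; "reference.wav" is a placeholder, and 328k approximates the maximum joint-stereo bitpool at 44.1 kHz:

    # Round-trip a lossless source through SBC at a high bitrate,
    # then ABX the decoded copy against the original.
    import subprocess

    def sbc_roundtrip(wav_in: str, wav_out: str) -> None:
        # Encode to raw SBC at (roughly) the maximum A2DP bitrate...
        subprocess.run(
            ["ffmpeg", "-y", "-i", wav_in,
             "-ar", "44100", "-ac", "2",
             "-c:a", "sbc", "-b:a", "328k",
             "roundtrip.sbc"],
            check=True)
        # ...then decode straight back to PCM for comparison.
        subprocess.run(
            ["ffmpeg", "-y", "-i", "roundtrip.sbc", wav_out],
            check=True)

    sbc_roundtrip("reference.wav", "decoded.wav")

Load reference.wav and decoded.wav into any ABX comparator (foobar2000 has one, for instance) and see whether you can reliably tell them apart.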

So when your Bluetooth device sounds better wired than wireless, or works better with aptX or some other patented nonsense codec, most likely the problem is careless software/firmware (or outright crippling to push patent licenses), not the spec being bad.

(I say this as someone who has gotten into flamewars over the quality of ffmpeg's AAC implementation and found bugs in the Opus reference encoder; I can tell when audio quality drops.)


My personal, albeit anecdotal, experience is with a fairly decent Kenwood head unit in my car. I'll often use Bluetooth for convenience, but from time to time I'll connect over USB. Whenever I make the direct USB connection, the sound quality is noticeably better: rounder, deeper sounds, and much better handling of quiet and loud passages ...

This leads me to conclude that one or the other of the following must be the case:

- The iPhone DAC is a billion times better than the Kenwood DAC

- Kenwood and iPhone have negotiated a poor codec

> most likely the problem is careless software/firmware

So, don't take this the wrong way, but I'm going to wince and say this has a shade of a no-true-Scotsman argument to it ... and I say this because I think this is the crux of what's wrong with Bluetooth. It is a very closed technology, and it's very hard to get a leg up on the standardisation, or on how to use it for your own devices, if you want to do anything in any way commercial.

I'm saying this as a telecoms guy who is used to having high-quality standards documentation and reference implementations for just about everything I do. Interoperability is key. Though Bluetooth is a communications standard, it seems to have been influenced more by consumer electronics than by comms.

After 10-15 years or so of using Bluetooth, it increasingly feels like a technology not really developed in good faith. You get these clunky, imprecise results that I and other people report all the time. With a Bluetooth device you're always throwing the dice, versus the certainty of a plain old wired connection. It's plain to see that right across the industry people are hedging against it, and anybody who does require reliable M2M short-range radio is using a proprietary protocol.

Bluetooth is a millstone around our necks. Yes, it gives you freedom from wires and a certain limited amount of interoperability, but you don't get much that you wouldn't have gotten from proprietary radio technologies.


I'm not saying Bluetooth audio doesn't suck in practice. I'm saying it's not the codec/technology's fault, but everyone assumes it is, and that is exactly what all the companies peddling patented alternate codecs want you to think.

I have a set of Bluetooth speakers, and they sound better over the wired aux in than wireless. I know what this experience is like, and this is why I have tested the codec myself and concluded that it wasn't the problem. And since Bluetooth is a digital audio standard, if the codec isn't the problem and the data is arriving at the destination, then clearly any quality problem is the destination's fault.

Don't get me wrong, Bluetooth is a horrible standard for many other reasons (it's worse than USB, and that's saying something); I've implemented Bluetooth-related protocols. But "screws up audio quality" isn't one of them.


I'm saying that being so vulnerable to codecs *is* the flaw with Bluetooth. I'm happy you found a combination that works for you ... but what exactly is the point of Bluetooth if you can't have good interoperability?

In fact, I wouldn't be surprised if this issue was created just so there could be a "marketplace" for codec peddlers ...


> Transcoding doesn't make much of a difference either.

I respectfully disagree. Transcoding does make an audible difference at the bitrates typically used, and it should have no place in "high fidelity" audio. My issue with these Bluetooth codecs is that they are not used at the source, so in practice they will always be involved in a transcode.


Have you ABX tested an SBC transcode at the maximum bitpool settings vs the original? Because I have, and I couldn't tell the difference.

The "transcoding is bad" story is about low quality settings, archival, and/or bad encoders. E.g. don't do repeated transcodes with the ffmpeg AAC encoder, not even at 320kbps. But one final transcode with SBC for over-the-air delivery at the max allowed bitrate? That's totally fine, especially for typical Bluetooth use cases (listening on the go, casual headphones, etc).


Well, I have noticed significant degradation of 128 kbps MP3 sources when sent over Bluetooth, whereas my FLAC CD rips sound fine. I don't think the effects of transcoding are well understood or well studied, especially the interactions between different codecs.



