Bluetooth audio latency, codecs, and when it matters

These are some random-ish observations about Bluetooth audio latency vs. "codecs". This started life as a long comment trying to untangle a messy thread in Wirecutter's review of Bluetooth speakers, but since it covers a lot of things I periodically forget, I'm keeping it here as a refresher for next time I'm shopping for BT headphones.

A couple of pre-clarifications:
  • Different people measure different things, sometimes without even realizing it, so it's important to understand what a number actually means. Algorithmic delay? Hardware (+ firmware) implementation? Hardware + software? Which software: OS + encoding + decoding, or just the OS piping things around? Is there an app involved? Each layer not only adds delay (lag, latency), but also variation (jitter).
  • A "codec" can mean different things. It can be a hardware DAC/ADC, or it can be a converter between two digital streams (and therefore implementable in software). In this post I use the word "codec" for the second kind. A DAC's latency is measured in microseconds, while a codec's runs much longer, well into the milliseconds – even if implemented in hardware.
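As a toy illustration of how those layers stack up, here is a back-of-the-envelope latency budget. Every number below is made up purely for illustration; the point is that the codec is only one term in the sum, and jitter accumulates too:

```python
# Toy latency-budget sketch: each layer contributes a fixed delay plus
# some jitter. All numbers are invented for illustration only.
layers = {
    "app":          (5.0,  2.0),   # (delay ms, jitter ms)
    "os_mixer":     (10.0, 3.0),
    "sbc_encode":   (15.0, 1.0),
    "bt_transport": (60.0, 20.0),  # radio + buffering usually dominates
    "sbc_decode":   (15.0, 1.0),
    "dac":          (0.02, 0.0),   # hardware DAC: microseconds, negligible
}

total_delay  = sum(d for d, _ in layers.values())
total_jitter = sum(j for _, j in layers.values())  # worst case, if it all adds up
print(f"end-to-end: ~{total_delay:.0f} ms +/- {total_jitter:.0f} ms")
```

Note how swapping the codec for one that is, say, 10 ms faster barely moves the total if the transport and buffering layers dominate.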

As an initial reference, even wired headphones on an iPhone show latencies over 60 ms – from touching the screen to hearing a generated sound (so HW + OS + app) [1].

"Music-quality" Bluetooth audio has in general about 200+ ms of latency, highly dependent on the whole system. Particular codecs have some influence, but their effect can be dwarfed by the rest of the stack [2]. Silicon Labs cites "around 100-150 ms" for SBC [3] (interesting that it's such a big range!); but that is just HW.

SBC is the standard Bluetooth "music quality" codec. It is very flexible, so different manufacturers can customize/mangle it in different ways and make it better or worse while still being SBC. [4]

But 200 ms is huge! How did I not notice?

200 ms of audio/video latency is perceptible and annoying. How is it possible that we are all watching video with BT headphones without even realizing? How did BT headphones ever take off in the first place?

Turns out that OSes try to compensate for it, and that is why you usually don't notice or care about the latency. If you are playing video, the OS can delay the rendering of images by 200 ms, and you will never notice the audio being out of sync... IF your BT speaker is not doing something very wrong.
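A minimal sketch of that compensation, assuming the player somehow knows (or guesses) the audio sink's latency; the 200 ms figure and the function are illustrative, not any real API:

```python
# Lip-sync trick sketch: if the player knows the audio sink's latency,
# it can delay every video frame by the same amount, so image and sound
# come out together. 200 ms is an illustrative figure, not a real value.
AUDIO_SINK_LATENCY_MS = 200.0

def present_time(frame_pts_ms: float) -> float:
    """Shift a video frame's presentation timestamp so the image
    appears in sync with the (late) audio."""
    return frame_pts_ms + AUDIO_SINK_LATENCY_MS

# A frame originally due at t = 1000 ms is shown at t = 1200 ms,
# right when its audio finally leaves the BT speaker.
shifted = present_time(1000.0)
print(shifted)
```

This only works for pre-recorded streams, where delaying everything costs nothing; for interactive audio there is nothing to shift, which is exactly the gaming problem below.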

If you are just playing music, you can only notice the lag when you start or stop the music, and chances are that the biggest source of lag will not be BT anyway.

But the moment you interact with the source of the audio (as in a game, or when typing on a keyboard), you will notice the delays.

"Voice-quality" Bluetooth audio (headset profile, CVSD codec) has much smaller latency, which makes it usable for videoconferencing, even if only at 8 kHz, traditional-telephony quality. Silicon Labs gives around 30 ms. Interestingly, the newer mSBC codec allows 16 kHz at almost the same delay [3], but still mono.

When does latency matter?

So a low-latency SYSTEM (not codec) is only needed for "music-quality" audio for interactive use, like gaming.

What kind of system would that be? Silicon Labs and Qualcomm report aptX Low Latency at 40 ms – so slower than the standard CVSD and mSBC, but higher quality. Interestingly, Silicon Labs also says they don't support it, given how rare aptX LL sources are [3]. [4] says aptX LL seems not to be supported anywhere except in bundled transmitter + receiver sets. Which makes sense: as we've seen, different manufacturers can add wildly different latencies, so if one wants a well-defined latency, better to sell a whole package that guarantees it. This kind of bundling also lets the manufacturer shave off any flexibility that would add latency. It's a bit of a vicious/virtuous circle, depending on the angle.

Anyway, aptX LL seems to have been "obsoleted" as of 2019 by aptX Adaptive. [5]

iOS devices do NOT support aptX, but Macs do. Apple devices in general support the AAC BT codec for better quality, which can be problematic, for example, if one uses AirPods with Android phones, which seem to do a rather bad job with AAC. [2]

Another interesting case where latency matters is when one tries to use multiple, different Bluetooth audio devices at once. If each device has a different latency, the OS can't compensate! One can easily see the problem by using macOS's included Audio MIDI Setup.app to create an aggregate device out of multiple BT speakers and playing music: each speaker plays with a slightly different delay.
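A quick way to see why no global compensation can help here: delaying the source shifts all the speakers by the same amount, so the skew between them survives untouched. The latency numbers below are made up:

```python
# Why the OS can't fix an aggregate of BT speakers: any compensation
# applied to the source delays every sink equally, so the differences
# between sinks remain. Latencies below are invented for illustration.
latencies_ms = {"speaker_a": 180.0, "speaker_b": 240.0, "speaker_c": 150.0}

# The inter-speaker skew is what you hear as echo/phasing, and it is
# invariant under any global delay of the stream:
skew_ms = max(latencies_ms.values()) - min(latencies_ms.values())
print(f"worst inter-speaker skew: {skew_ms:.0f} ms")
```

Fixing this would require per-device delay lines on the faster sinks, which is exactly the kind of knob the stock BT stack doesn't expose.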

Still on macOS, one can download the Additional Tools for Xcode, which include the Bluetooth Explorer app, which lets one play a bit with the BT stack. One of the settings in its Audio Options dialog is Device latency, but I don't see it doing much, and only one set of options is presented even if multiple BT audio devices are connected. Oh well. Maybe macOS Catalina...?

Final thoughts

This is a good example of how different manufacturers can do lots of things that make two similar-looking devices inscrutably different. Or how a single manufacturer can have two devices that are unexplainably, weirdly different. Or a speaker that works better with Android than with an iPhone, or vice versa.

Even further, a simple device update can wildly change its response [6].


[1] https://stephencoyle.net/latency
[2] https://www.soundguys.com/understanding-bluetooth-codecs-15352/
[3] https://www.silabs.com/community/wireless/bluetooth/knowledge-base.entry.html/2015/08/06/audio_latency_withb-7BGG
[4] https://habr.com/en/post/456182/
[5] https://www.soundguys.com/android-bluetooth-latency-22732/
[6] https://superpowered.com/latency
