
Reviewing and deciding upon a default buffer size strategy for ALSA

Open mitchmindtree opened this issue 5 years ago • 5 comments

#401 finally allows users to specify a fixed buffer size, allowing some control over the trade-off between energy efficiency and low latency.

The BufferSize::Default value allows the user to fall back to whatever default buffer size the host provides. However, ALSA is a bit of a special case in that the default size can be extreme, sometimes implying a delay of multiple seconds up to a minute - an impractical delay for the majority of applications. As a temporary workaround, #401 has opted to have the BufferSize::Default branch select a ~100ms delay to match what was previously hard-coded. I've opened this issue so that we can discuss a more suitable default strategy for ALSA.

See here and here for prior discussion on potential strategies. Of note, @sniperrifle2004 made the following suggestion:

As an alternative suggestion to the lowish 100ms, aplay does this by default: get the maximum buffer time, cap it at half a second, then restrict the period time to near a quarter of this buffer time (that's where I got that metric from; aplay has been immensely useful as both a reference implementation and a tool), and then restrict the buffer time to near the chosen value.

mitchmindtree avatar Jul 14 '20 12:07 mitchmindtree
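The aplay-style heuristic quoted above can be sketched in a few lines. This is illustrative arithmetic only, not cpal's actual code: `default_times` and its microsecond units are assumptions standing in for whatever maximum buffer time the ALSA device actually reports.

```rust
// Sketch of the aplay-style heuristic: cap the device's maximum buffer
// time at half a second, then aim the period time at a quarter of the
// chosen buffer time. All values are in microseconds.
const CAP_US: u32 = 500_000; // half a second

/// Hypothetical helper returning (buffer_time_us, period_time_us).
fn default_times(max_buffer_time_us: u32) -> (u32, u32) {
    let buffer = max_buffer_time_us.min(CAP_US);
    let period = buffer / 4;
    (buffer, period)
}

fn main() {
    // A device reporting an extreme maximum (4 s) gets capped at 500 ms.
    assert_eq!(default_times(4_000_000), (500_000, 125_000));
    // A device whose maximum is already modest keeps it.
    assert_eq!(default_times(200_000), (200_000, 50_000));
}
```

In a real implementation the computed times would still be passed through ALSA's `set_*_near` configuration calls, which may adjust them to the nearest value the hardware supports.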

What latency would BufferSize::Default typically result in on other platforms? That could be what we cap the max buffer time at on ALSA, ensuring somewhat consistent behavior across platforms.

Then restrict the period time to near a quarter of this buffer time

Two periods are usually sufficient, especially at high latencies (hundreds of ms), and mean fewer wakeups.

diwic avatar Jul 17 '20 15:07 diwic

Two periods are usually sufficient, especially at high latencies (hundreds of ms), and mean fewer wakeups.

That is true. aplay takes a conservative approach here. However, since the user is unable to affect the number of periods in the buffer directly, I think some "just in case" fallout protection is appropriate.

sniperrifle2004 avatar Jul 17 '20 16:07 sniperrifle2004
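The trade-off discussed in the last two comments is simple arithmetic: for a fixed buffer time, the period count sets how often the audio thread wakes up. A minimal sketch (the function name is hypothetical, not from cpal):

```rust
// For a fixed buffer time, more periods means shorter periods and
// therefore more frequent wakeups of the audio thread.
fn wakeups_per_second(buffer_time_us: u32, periods: u32) -> f64 {
    let period_us = buffer_time_us / periods;
    1_000_000.0 / period_us as f64
}

fn main() {
    // With a 500 ms buffer: two periods wake up 4 times per second,
    // while four periods (aplay's quarter-buffer metric) wake up 8 times.
    assert_eq!(wakeups_per_second(500_000, 2), 4.0);
    assert_eq!(wakeups_per_second(500_000, 4), 8.0);
}
```

This is why two periods are attractive for energy efficiency at high latencies, while four periods give more headroom against underruns when the user cannot tune the period count directly.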

What about enforcing a limit on the acceptable range, à la BufferSize::Default(min, max)? That would still let the system select a value it feels comfortable with, but without accidentally ruining the experience. Passing the maximum possible range would still allow this limitation to be disabled.

kawogi avatar Sep 12 '22 23:09 kawogi
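The range-limited variant proposed above could be resolved with a simple clamp. Note this is a sketch of the *proposal*, not cpal's actual API: the real BufferSize::Default carries no data, and the `resolve` helper and frame-count units are assumptions.

```rust
// Hypothetical variant of the proposal: Default carries an acceptable
// range (in frames), and the host's preferred size is clamped into it.
enum BufferSize {
    Default(u32, u32), // hypothetical (min_frames, max_frames)
    Fixed(u32),
}

fn resolve(requested: &BufferSize, host_preferred: u32) -> u32 {
    match requested {
        BufferSize::Default(min, max) => host_preferred.clamp(*min, *max),
        BufferSize::Fixed(frames) => *frames,
    }
}

fn main() {
    // An extreme host default (here, 262144 frames) is clamped to the
    // caller's maximum instead of silently causing seconds of delay.
    assert_eq!(resolve(&BufferSize::Default(64, 4096), 262_144), 4096);
    // Passing u32::MAX as the max effectively disables the limit.
    assert_eq!(resolve(&BufferSize::Default(64, u32::MAX), 262_144), 262_144);
}
```

This keeps the "trust the host" behavior of the current Default while bounding the worst case that ALSA's extreme defaults can produce.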