[PD] Audio latency on linux

Orm Finnendahl orm.finnendahl at selma.hfmdk-frankfurt.de
Wed May 10 19:04:27 CEST 2023


Hi,

For a project involving controlled and tuned feedback through the
audio interface, we need very low latency (~9-10 ms roundtrip through
the analog ins/outs) for it to work properly. We did some tests on
OSX- and Linux-based systems with different audio interfaces.

On OSX I can get down to 16 ms with pd (at 44.1 kHz sample rate);
using Max/MSP we can get below 10 ms with a vector size of 32.

On Linux the lowest I can get with pd is 29.33 ms at a 48 kHz sample
rate (this is with ALSA; JACK is ~34 ms).
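
For reference, the basic block-size arithmetic behind these figures
(a quick sketch in C; it only reproduces the conversions quoted in
this mail, nothing here touches the audio system):

    #include <stdio.h>

    /* convert a block size in samples to milliseconds at a given
       sample rate, and a measured roundtrip in ms back to samples */
    static double block_ms(int nsamples, double sr) {
        return 1000.0 * nsamples / sr;
    }
    static double ms_to_samples(double ms, double sr) {
        return ms * sr / 1000.0;
    }

    int main(void)
    {
        printf("64 samples at 48 kHz   = %.2f ms\n", block_ms(64, 48000));
        printf("64 samples at 44.1 kHz = %.2f ms\n", block_ms(64, 44100));
        printf("29.33 ms at 48 kHz     = %.0f samples\n",
               ms_to_samples(29.33, 48000));
        printf("9.6 ms at 48 kHz       = %.0f samples\n",
               ms_to_samples(9.6, 48000));
        return 0;
    }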

jack_delay reports a 9.6 ms roundtrip delay through the analog
ins/outs with a vector size of 64, so the driver is not the
bottleneck, and it should in principle be possible to get lower
latency on Linux. I don't know whether jack_delay is implemented with
additional I/O buffers, but even if pd adds an extra period in each
direction, the total should be well below 30 ms (64 samples are
1.33 ms at 48 kHz). The added 20 ms compared to jack_delay appears
quite large to me.
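
To make that comparison concrete, here is a minimal sketch of a JACK
client that prints the buffer size and the latency JACK itself
attributes to the physical ports (untested; the port names
"system:capture_1"/"system:playback_1" assume the usual ALSA backend):

    #include <stdio.h>
    #include <jack/jack.h>

    int main(void)
    {
        jack_client_t *client =
            jack_client_open("latency-probe", JackNullOption, NULL);
        if (!client) return 1;

        printf("buffer size: %u frames at %u Hz\n",
               jack_get_buffer_size(client), jack_get_sample_rate(client));

        /* latency JACK attributes to the physical in/out ports */
        jack_port_t *in  = jack_port_by_name(client, "system:capture_1");
        jack_port_t *out = jack_port_by_name(client, "system:playback_1");
        jack_latency_range_t r;
        if (in) {
            jack_port_get_latency_range(in, JackCaptureLatency, &r);
            printf("capture latency:  %u-%u frames\n", r.min, r.max);
        }
        if (out) {
            jack_port_get_latency_range(out, JackPlaybackLatency, &r);
            printf("playback latency: %u-%u frames\n", r.min, r.max);
        }
        jack_client_close(client);
        return 0;
    }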

I even tried compiling pd with a lower block size (I changed
DEFDACBLKSIZE in s_stuff.h and DEFSENDVS in d_global.c, and can get
down to a vector size of 32 without distortion at the audio
interface), but that doesn't change the I/O latency at all. I haven't
yet studied the ALSA/JACK-related code.
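
For clarity, the change I mean looks like this (sketch only; the
stock value in s_stuff.h is 64):

    /* s_stuff.h: pd's internal hardware block size */
    #define DEFDACBLKSIZE 32    /* was: #define DEFDACBLKSIZE 64 */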

Can anyone shed some light on this? It'd be much nicer to use pd for
this than to write a dedicated app doing all the DSP (or to use
Max/MSP on OSX) :-(

--
Orm
