• SONAR
  • How is the status and history of PDC in Sonar? (p.2)
2018/04/24 19:25:48
iRelevant
chris.r
So, I'm used to turning effect plugins on and off during playback to hear the difference they make to the sound, but the process isn't smooth and gapless: clicks and pops appear quite often, and sometimes a plugin gets out of sync (you clearly hear it sounding wrong and need to stop playback to recover). Isn't that a kind of bug, or some sort of flaw in the DAW's engine architecture? I always thought it was related to the way PDC works in Sonar. Do other DAWs not have the same issue?
 
Maybe I need to change my habit of using the on/off button in the FX bin and use the plugin's own bypass button instead. The problem is that not every plugin has a bypass button, and in some of them it behaves erratically.


I don't know how the Audio Engine in Sonar works, but I imagine calculating the Plugin Delay Compensation for a project is a fairly complex operation. If you disable a plugin, I guess PDC needs to be recalculated for the entire project ... which might cause a dropout while that happens. I could also imagine that any plug-in changing its latency during operation ... say when a bypass switch is pushed ... could be problematic. 
The best approach in your case is probably what Mr. Anderton is suggesting.
2018/04/24 19:34:21
iRelevant
mister happy
Turning a plugin on and off only changes the observed latency if the plugin is latency inducing. Not many plugins are, and the ones that are usually have options to minimize or turn off the latency.

I'm wondering what a "latency inducing" plugin is? 
 
mister happy
[...]
I have long been surprised by the lack of tutorial info made available by DAW makers and DAWxperts regarding the initial setup and testing of DAW synchronization. It seems like one of the most important aspects of multitracking, but it is rarely mentioned. 

Well, I consider it to be the dark secret of digital audio :) Not much talked about. 
2018/04/24 21:15:00
azslow3
I do not see PDC as complex. Plug-in (i) reports that it will produce output for the input with delay X(i). After summing, each strip's output is delayed from its input by (X0+X1+X2+...+Xn). When these outputs are mixed into some bus, the "faster" strip has to wait for the "slower" strip, so the mix is calculated from outputs corresponding to the same original input. The bus can in turn add more delay, so summing different buses also has to be synchronized.
 
F.e.
 
        |-> Bus 1 (0 delay)    -->|
Track --|                         |--> Mix to Bus 4 (0.2s delay) --|
        |-> Bus 2 (1s delay)   -->|                                |
        |                                                          |--> Mix to Bus 5
        |-> Bus 3 (0.3s delay) ------------------------------------|
 
To mix things correctly, we need to:
* add a 1s delay to Bus 1
* add a 0.9s delay to Bus 3 (Bus 4's output is delayed by 1s + 0.2s = 1.2s; Bus 3's own delay is 0.3s; 1.2 - 0.3 = 0.9)
 
There are many ProTools videos in which these "extra delays" have to be added manually...
All (?) DAWs can add them automatically now.
 
Note that "Track" can be one or many since original material is assumed to be "in sync".
 
Sonar routes audio in "real time", which means you hear the audio 1.2 seconds after it has entered the engine. That is problematic for live monitoring, so there is an option to disable the compensation: "Live Input PDC Override". But it is also a problem for mixing: if you change some parameter on Bus 3, you will not hear the result until the 0.9s of added compensation (plus Bus 3's own 0.3s) has passed. So "mastering" plug-ins (which introduce such huge delays) should be avoided on any intermediate tracks/buses.
 
Some DAWs (Reaper, S1, others?) use a "trick": they can "pre-calculate" the part of the routing tree that has no live inputs. The output of any such (sub)tree then always has 0 (zero) effective delay, independent of which plug-ins are used, so "PDC Override" is not required. In the example, changes on Bus 3 would be delayed at MOST by 0.3s, i.e. by its own delay, which cannot be avoided.
The trick has yet another effect: if we pre-calculate ALL recorded material (e.g. 50-200ms ahead), the processing is no longer strictly bound to the "real time" deadline that ASIO imposes to avoid under-runs. So it is possible to get the same "stability" as with huge buffers (2048, 4096, up to "normal" Windows Audio buffering) while keeping the ASIO buffer size at a minimum; only live monitoring is still forced to run in real time (and during mixing and mastering there is no live input at all).
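 
Here is a toy Python sketch of that pre-calculation idea, under the simplifying assumption that the whole (sub)tree behaves like a single plug-in that is a pure delay; the block size, latency value and buffer handling are made up and only illustrate why the pre-rendered path ends up with zero effective delay.
 
import numpy as np

BLOCK = 64          # pretend "ASIO" callback size, in samples (arbitrary)
LATENCY = 4096      # the plug-in's reported latency, in samples (arbitrary)

def plugin_process(x, latency=LATENCY):
    """Toy 'plug-in': a pure delay of `latency` samples, which is what a
    look-ahead limiter effectively does to the signal's timing."""
    return np.concatenate([np.zeros(latency), x])[: len(x)]

# Recorded clip: known in advance, so this path can be rendered off-line.
clip = np.random.randn(48000)

# Pre-calculation: run the (sub)tree ahead of the playhead and drop the
# first `latency` samples -- the result is already time-aligned, so the
# path has zero *effective* delay whatever the plug-in's latency is.
prerendered = plugin_process(np.concatenate([clip, np.zeros(LATENCY)]))[LATENCY:]

def audio_callback(playhead):
    """The real-time callback only copies pre-rendered samples; no plug-in
    work happens here, so even a tiny ASIO buffer is easy to service."""
    return prerendered[playhead : playhead + BLOCK]

# The pre-rendered output lines up sample-for-sample with the source clip.
assert np.allclose(audio_callback(1000), clip[1000:1000 + BLOCK])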
 
2018/04/24 21:27:58
The Maillard Reaction

2018/04/25 15:34:23
iRelevant
azslow3
[...]


Thanks for your post, azslow3. Very informative. 
 
This pre-calculation trick is interesting; it's the first time I've heard about it. I guess there are limitations though: even though there is no live audio input during mixing and mastering, I would consider the recording of automation data to be a real-time live input.
 
Not all contemporary DAWs have foolproof APDC, btw ... I won't mention name(s?).
2018/04/25 18:51:52
azslow3
iRelevant
This pre-calculation trick is interesting; it's the first time I've heard about it. I guess there are limitations though: even though there is no live audio input during mixing and mastering, I would consider the recording of automation data to be a real-time live input.

Automation and visualization from plug-ins are influenced by pre-calculation: e.g. you first see the signal on an EQ's display... and only hear it a moment later.
But in practice around 20ms in such cases is not a problem. Human operations (except "automatic" ones, e.g. drumming gestures) have huge "latency", and in addition our brain easily adapts to delays in the natural range. Compare singing with 20ms latency in headphones (most people are sensitive to delays over 5ms in that case) with playing a MIDI keyboard at the same latency (some people think they also need 5ms there, while they use MIDI-connected keyboards... and it takes about 10ms just to transfer a 10-finger chord).
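 
A quick back-of-the-envelope check of that "10ms for a 10-finger chord" figure (classic 5-pin serial MIDI at 31250 baud, 10 bits per byte on the wire, 3-byte Note On messages, running status ignored):
 
# Classic serial MIDI: 31250 baud, 10 bits per byte (1 start + 8 data + 1 stop),
# and a Note On message is 3 bytes (status, note number, velocity).
BAUD = 31250
BITS_PER_BYTE = 10
NOTE_ON_BYTES = 3

def chord_transfer_ms(notes):
    bits = notes * NOTE_ON_BYTES * BITS_PER_BYTE
    return 1000.0 * bits / BAUD

print(chord_transfer_ms(10))   # 9.6 -> the last note of a 10-note chord
                               # arrives roughly 10ms after the first
 
(Running status would shave that to roughly 6.7ms, which does not change the point.)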
 

Not all contemporary DAWs have foolproof APDC, btw ... I won't mention name(s?).

From what I could see on the Internet, most problems come from plug-ins that do not report their delay correctly.
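 
For what it's worth, a simple way to sanity-check a suspect plug-in is an impulse test: feed it a single-sample click, see how many samples later it comes out, and compare that with the latency it reports. A toy Python sketch, with a completely made-up ToyPlugin class standing in for a real plug-in:
 
import numpy as np

class ToyPlugin:
    """Hypothetical plug-in: it delays audio by `actual` samples but tells
    the host `reported` samples -- the kind of mismatch that breaks PDC."""
    def __init__(self, actual, reported):
        self.actual, self.reported = actual, reported
    def process(self, x):
        return np.concatenate([np.zeros(self.actual), x])[: len(x)]

def measured_latency(plugin, length=16384):
    """Impulse test: a single-sample click in, find where the peak comes out."""
    impulse = np.zeros(length)
    impulse[0] = 1.0
    return int(np.argmax(np.abs(plugin.process(impulse))))

plug = ToyPlugin(actual=512, reported=256)
print(plug.reported, measured_latency(plug))  # 256 vs 512 -> PDC ends up 256 samples short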