
Are Daisy-Chained Virtual Instruments not Plugin Delay Compensated?

Hey guys, quick sanity check here...

I'm realizing that when a MIDI line is recorded, playback from an instrument that is feeding its output into another instrument's input is not PDC-compensated, so it arrives late on the beat by roughly the number of samples incurred by the interface buffer size.

Is this correct?

How do you guys deal with this? Just make an instrument track with enough delay offset to compensate?
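For a rough sense of the numbers (just a sketch, assuming the late arrival is exactly one device buffer, which may not hold for every chain):

```python
# Convert a buffer-sized delay into the track delay (in ms) you'd dial in
# to pull the chained instrument back onto the beat. Illustrative only.
def compensation_ms(buffer_samples: int, sample_rate: int) -> float:
    return buffer_samples / sample_rate * 1000.0

print(compensation_ms(256, 48000))   # ~5.33 ms for a 256-sample buffer at 48 kHz
print(compensation_ms(1024, 44100))  # ~23.2 ms for a 1024-sample buffer at 44.1 kHz
```

Something like that entered as a negative track delay, then fine-tuned by ear, is how I'd approach it.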

Thank you!
 

Just for the sanity clause, are you talking SO1 to hardware and back or inside the box with Vsti's?
I ask as my brain went off in all sorts of directions.:unsure:

Best regards
 

In my case, strictly chaining VSTi into each other's INs and OUTs ITB.

Sorry for tipping the psychological dominoes over

😁
 
Dominoes...It's OK, it happens all the time these days :ROFLMAO:

I was wandering around trying to work out if MIDI timestamps were part of the omelette.
Do they get created by the sender or the receiver, and are they stripped and replaced, etc.?

I think the nail is well struck by trying to calculate any offset. It's a good question tho'.

Kindest regards.
 
I highly suspect there is some plugin out there that simply calculates the current plugin latency and adds it to the buffer round-trip latency at the current sample rate. In theory you could just pop that on the plugin output and Bob's your uncle. Eventide has something called 'Precision Time Align' that ties directly into PDC and does something similar; it can do negative delays, but only up to about 10 ms.

Although now that I think about it, it might not be possible, due to requiring too much lookahead to effect a negative offset...
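For what it's worth, here's my mental model of the trick (a conceptual sketch only, not any real plugin's API): a plugin can report latency to the host without actually delaying its audio, and the host's PDC then shifts everything as if that latency were real, which lands the track early.

```python
# Conceptual sketch of a "negative delay" via PDC (hypothetical, not a real API).
# The plugin claims N samples of latency but passes audio straight through;
# the host compensates for latency that never happened, so this track ends up
# N samples ahead of everything else. The practical ceiling (~10 ms in
# Eventide's case) reflects how much lookahead/pre-roll the host will allow.
import numpy as np

class NegativeDelaySketch:
    def __init__(self, advance_samples: int):
        self.reported_latency = advance_samples  # what the host is told to compensate

    def process(self, block: np.ndarray) -> np.ndarray:
        # No actual delay applied: output == input.
        return block

neg = NegativeDelaySketch(advance_samples=441)  # ~10 ms worth of "advance" at 44.1 kHz
print(neg.reported_latency)
```

So a fixed negative offset seems doable, but only within whatever lookahead budget the host grants; you can't conjure samples that haven't been rendered yet.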

Just one of those things!
 
TD.
I set up Pigments to run a basic 16-note sequence, switched off its oscillators, and started to load in sets of VSTs to receive the output from Pigments. I used groups of 4 of each synth I loaded, just to put a bit of stress into what I was doing.

With AQ and snap set to on, I recorded the note output of Pigments to the synths. I was surprised when zooming in that the notes were not aligned right on point (Auto Quantize). Switching the ruler to samples, they were around 16 samples early with just 4 Vital synths.
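To put that in perspective (sketch assumes 48 kHz; adjust for the actual project rate), 16 samples is a tiny fraction of a millisecond:

```python
# Quick conversion from sample offsets to milliseconds. 48 kHz assumed here;
# substitute the project's actual sample rate.
def samples_to_ms(samples: int, sample_rate: int = 48000) -> float:
    return samples / sample_rate * 1000.0

print(samples_to_ms(16))   # ~0.33 ms: the offset seen with 4 Vital instances
print(samples_to_ms(512))  # ~10.7 ms: roughly one typical device buffer, for scale
```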

Added in another batch of 4 VSTs, wash and rinse, and the notes were placed at another point on the timeline.
8 synths running now (4 Vital, 4 Modwave native).
Added another 4 VSTs, again a different result?
12 synths running (4 Vital, 4 Modwave, 4 Omnisphere), notes again in another position.

Is there some background midi function compensating for plugin latency and placing notes?
All of the notes were aligned to each other, but the position moved on the timing grid when a different flavour of synth was added.

This is just an observation, I don't know what is going on or if this is just a function of more computing cycles etc.
Thought you might be interested tho'.

Best of regards
 
I feel like we need Lukas or someone to chime in.

Not really or someone, just Luke.

@Lukas

 
How is the performance if you use a multi? If you're sending the same midi to multiple instruments anyway, this might work better
 
@BobF
That's well beyond my pay grade. :)

It would be interesting to have an answer tho'. Is there any inter-plugin compensation when they're used in this fashion, as the OP has asked?

Best regards
 
Yes, PDC is broken, but it is not related to buffer size; it is related to the latency of the plugins in the song.
[Embedded video]
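To spell out what "related to the latency of the plugins" means in practice, here's a simplified model of what correct compensation across a serial chain would do (not Studio One's actual internals, just the arithmetic involved):

```python
# Simplified model of plugin delay compensation across a serial chain.
# Not Studio One's implementation; hypothetical numbers for illustration.
def chain_latency(reported_latencies: list[int]) -> int:
    # Serial chains accumulate latency; each plugin's reported delay adds up.
    return sum(reported_latencies)

def compensated_event_time(event_sample: int, reported_latencies: list[int]) -> int:
    # A correct host would trigger the MIDI event early by the chain's
    # total latency so the audible result lands on the grid.
    return event_sample - chain_latency(reported_latencies)

# e.g. three chained instruments/FX reporting 64, 128 and 32 samples:
print(chain_latency([64, 128, 32]))                   # 224 samples to compensate
print(compensated_event_time(48000, [64, 128, 32]))   # fire the note at sample 47776
```

On that model the device buffer only affects monitoring latency; the on-grid placement should come from the reported plugin latencies alone.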
 
How is the performance if you use a multi? If you're sending the same midi to multiple instruments anyway, this might work better
VSTis in Multi Instruments cannot be chained in series; they can only run in parallel, unfortunately.
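If I understand the routing right, that's also why a Multi sidesteps the worst of this: in parallel the host only has to cover the slowest branch, while a serial chain stacks every stage's latency. A toy illustration (conceptual only, hypothetical latencies):

```python
# Parallel (Multi Instrument) vs. serial (daisy chain) latency, conceptually.
def parallel_latency(latencies: list[int]) -> int:
    return max(latencies)   # host pads every branch up to the slowest one

def serial_latency(latencies: list[int]) -> int:
    return sum(latencies)   # each stage's delay is added to the next stage's input

stages = [64, 128, 32]      # hypothetical reported latencies in samples
print(parallel_latency(stages))  # 128
print(serial_latency(stages))    # 224
```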
 
Just for clarification,
I did not mention or imply anything about MIDI jitter. I understand what MIDI jitter is, and my post had nothing to do with it.
The acronym is PDC: Plugin Delay Compensation.

Kindest regards
 
sintil8, then I fail to understand your post #6. (Edit: Or maybe I don't understand TonalDynamics' posts. There's no screenshot, routing info, or list of the VSTs used.)

For instance, you're not adding a MIDI VST (MIDI in, MIDI out). I don't know Pigments, but Omnisphere doesn't output MIDI.
 
@Jens
Pigments can be used as a midi sequencer and outputs midi notes.

Pigments is feeding all of the other instruments: one source of output to all of the added VSTis.
I tried to see if the VSTs would introduce a variety of latencies, etc. Auto Quantize, when switched on, surprised me, as the notes were being placed off grid but at the correct 1/16 intervals apart.

I switched snap off, selected all, and moved the notes as a group; I was able to place the notes on grid as a group... No jitter?

I had also done the same without A/Q and had a variety of different results for that run through.

This was just an exploration of the question, not making any judgement or statements!
Hope this helps your understanding. I see this as a moot point from here on in.

Kindest regards.
 
Yes, PDC is broken, but it is not related to buffer size; it is related to the latency of the plugins in the song.
[Embedded video]
Just the confirmation I was looking for. Thanks for making that!

Sanity restored.

That said, I believe this is a loooong-standing bug; pretty sure I've observed this behavior since version 3 or some such.

So daisy-chaining virtual instruments has basically never been 'fixed'.
 
@TonalDynamics

I'm still watching the dominoes fall. :unsure:
Is that not the expected outcome? The YouTube video purposely introduces 500 milliseconds of latency into the project.
When putting the quantized MIDI event onto the VeloScaler track and then recording into the Sampler track, the notes are placed a 1/4 note late.
Is the DAW compensating by aligning the notes a 1/4 note late, i.e. adjusting the recorded MIDI notes to line up with the latency of the audio channel (or the project latency)? 500 milliseconds at 120 bpm = a 1/4 note.
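The arithmetic that makes those two numbers collide (and why it's hard to tell which mechanism is at work):

```python
# At 120 bpm one beat (a 1/4 note) is 60000/120 = 500 ms, i.e. exactly the
# injected latency, so "adjusted to the project latency" and "shifted by one
# 1/4 note" look identical. At another tempo they would diverge.
def latency_in_beats(latency_ms: float, bpm: float) -> float:
    return latency_ms / (60000.0 / bpm)

print(latency_in_beats(500, 120))  # 1.0  -> exactly one 1/4 note
print(latency_in_beats(500, 90))   # 0.75 -> no longer lines up with a 1/4 note
```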

Is there any measurement between the VeloScaler and the Sampler, or have I misunderstood? Is this measuring the combined PDC of the VSTs on the linked channels, or adjusting the MIDI to the latency of the project?
If I am looking at this incorrectly, please correct me; I am only human and do err at times.

Best of regards.
 
Don't get hung up on the 1/4 notes. This is just a result of an unfortunate choice of numbers on my part. I shouldn't have used 500 ms latency at 120 bpm. The issue is not about how much is being compensated, it's about when and how the audio/midi is compensated between the two instrument tracks.
 
FMN-Music, if you want to prove that a MIDI VST (VeloScaler) causes the problem, then a comparison take without VeloScaler is needed.
 
Don't get hung up on the 1/4 notes. This is just a result of an unfortunate choice of numbers on my part. I shouldn't have used 500 ms latency at 120 bpm. The issue is not about how much is being compensated, it's about when and how the audio/midi is compensated between the two instrument tracks.
I will take your word on the matter; however, would it be possible to use a different choice of numbers that clearly shows this is not a function of the project latency making the adjustment?

It would be good to have a definitive answer for the OP and others following this question.

Best regards to all.
 