• SONAR
  • How to keep outboard synths and VSTs in sync?
2015/04/15 08:37:20
Dan_E10
Hi all,
I have a modular analog synth that I'm using to supplement soft synths.  I've noticed that when playing the modular from midi tracks in Sonar X3, it plays just slightly after the VSTs.  When I record the modular to an audio track, delay compensation works correctly and the recorded audio clip lines up perfectly.  However, it's a pain to have to bounce tracks to get them to line up with the VSTs while I'm still in the process of editing the midi tracks.
 
It seems that for the most part the midi offset delay in preferences works well to get the outboard synth to line up with the VSTs during playback.  However, I've noticed that if I switch projects, the offset that worked for the first project doesn't work at all for the second project.  The modular is again out of sync with the VSTs.  I've found that the delay I need to add also varies with the ASIO buffer size.  Reducing the ASIO buffer to the minimum (2 ms) helps somewhat, but it's still not perfect and I get more dropouts at that setting.
 
From reading other threads here such as the midi jitter thread, I'm guessing this is one of those complicated "many links in the chain" problems where midi and audio drivers as well as the OS all affect the outcome.  Anyway, I'm wondering if anyone here has some tips on how they keep their outboard and virtual synths in sync.  Does it usually just work, or do you have to do some testing with each project/session to determine the midi delay you need to use?
 
I'm on Win 7 64-bit using an old Emu 1820M, by the way.  The 1820M only has beta drivers available for Win 7, but they work for me more or less.  I don't have an external clock; the 1820M is providing all the timing.  Sometimes I do feel like the midi clock drifts slightly from the audio clock as well, but this isn't repeatable for me.
 
Dan
2015/04/15 08:55:02
lfm
I use Metaplugin for that - applying a general shift to a synth by however much I've calculated it needs to line up.
 
http://ddmf.eu
 
Among the best $50 I spent, apart from Sonar Artist.
 
What I do is leave the GUI at its defaults, so all audio just runs through.
Then I set the latency field to however many samples you need to report to the host.
As I remember it's about 350 samples for my Hammond.
 
So with Metaplugin on a track you report 350 samples, but no such delay is actually performed.
Meaning the host will place a recorded clip 350 samples early.
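 
Just to put that figure into time (a rough Python sketch, assuming a 44.1 kHz project - swap in your own sample rate):
 
# what a reported plugin latency works out to in milliseconds
sample_rate = 44100          # samples per second; use 48000 etc. if that's your project rate
reported_latency = 350       # samples Metaplugin reports to the host
offset_ms = reported_latency / sample_rate * 1000
print(f"{reported_latency} samples = {offset_ms:.2f} ms")    # about 7.9 ms
# the host's delay compensation will place recorded audio on that track
# this far earlier than it actually arrived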
 
I usually render external instruments to audio early, right after recording - so I don't have to keep the external gear on while finalizing it all.
 
EDIT: Forgot about the Time+ field on midi tracks. There you can adjust the timing so the audio for that track lines up. So much simpler in Sonar than in many hosts.
2015/04/15 09:23:24
tlw
I generally find things "just work", or very nearly. How far out is the audio compared to the MIDI?

Hardware synths take differing amounts of time to respond to MIDI, so I might need to slightly nudge audio to get spot-on agreement with the MIDI source track but it's a very few milliseconds at most and usually just a few samples.

I run at a 48 sample audio buffer and use Sonar's track echo function for monitoring by the way. That's around a 5ms round trip for the audio.
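
If it helps to see the arithmetic behind that (a rough sketch, assuming 44.1 kHz; the overhead figure is just my guess for converters and driver safety buffers):

# rough round-trip estimate for a 48-sample ASIO buffer
sample_rate = 44100
buffer_samples = 48
one_way_ms = buffer_samples / sample_rate * 1000    # ~1.09 ms per buffer
overhead_ms = 2.8                                   # assumed converter/driver overhead, not a measured figure
round_trip_ms = 2 * one_way_ms + overhead_ms
print(f"round trip ~{round_trip_ms:.1f} ms")        # lands around 5 ms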

Things that might mean sync varies between projects include using plugins at the tracking stage that require plugin delay compensation. That's generally anything with a look-ahead function, and convolution reverbs. Another issue might be if you are daisy-chaining hardware using MIDI thrus: each thru delays the MIDI a little, and because MIDI is a serial protocol the data intended for each synth in the chain gets sent to one synth, then the next, and so on, which can also affect MIDI timing. Finally, no MIDI clock is perfectly stable, and like any MIDI application in Windows, Sonar's clock does wander a little.
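
To put rough numbers on the serial-protocol point (a quick sketch using standard DIN MIDI timing):

# back-of-the-envelope MIDI wire timing for standard 5-pin DIN MIDI
baud = 31250                            # bits per second on the MIDI wire
bits_per_byte = 10                      # start bit + 8 data bits + stop bit
byte_ms = bits_per_byte / baud * 1000   # ~0.32 ms per byte
note_on_ms = 3 * byte_ms                # status + note number + velocity
print(f"one Note On takes ~{note_on_ms:.2f} ms on the wire")    # ~0.96 ms
# a chord, or several synths sharing one cable, multiplies that,
# before any thru-box delay is added on top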

If the MIDI sync is consistently out, there's a setting in preferences you can adjust to have Sonar automatically compensate. In my case it's not much use, as I have several synths connected, each of which has a slightly different MIDI processing time, plus two MIDI interfaces which also differ slightly in processing time.
2015/04/15 10:15:19
dcumpian
Can't say I've ever noticed any variance between the timing of VSTi's and external synths while playing back midi tracks. Something doesn't sound like it is working correctly. See if you can borrow another midi interface and check whether that tightens up your timing with the VSTi's. Not all midi interfaces work well, particularly when the MIDI side is an afterthought to an audio interface.
 
Regards,
Dan
 
2015/04/15 10:18:38
John
I use VSTis and hardware synths routinely and can't say there has ever been any sync problems. It just works. 
2015/04/15 10:23:23
bitflipper
It sounds as though your problem might be that you're monitoring the synth through SONAR, where the lag between generating a MIDI note on your controller and the synthesizer responding is going to be variable due to different latencies in different projects.
 
If that's it, then the solution is to monitor your synthesizer in real time. Most audio interfaces feature a so-called "zero-latency" monitoring mode, wherein its analog input channels are routed directly to the interface's output or headphone outputs. Check your interface's documentation.
 
2015/04/15 10:53:14
brundlefly
The 1820m has very low MIDI transmission delay compared to most USB interfaces, so assuming the synth has a decent response time (usually no more than a few milliseconds for modern hardware), any sync issue should be mostly due to audio latency. The Timing Offset can be used to address this, but it can have undesirable side-effects since the MIDI grid is offset for recording as well as playback. Ideally you should just keep your ASIO buffer as low as possible, and it shouldn't be a significant problem.
 
With ASIO buffer at 2ms, the RTL will be under 6ms and the MIDI transmission/response time will be adding maybe 5ms to that. Soft synths will be subject to the same outbound audio latency, so the discrepancy should only be on the order of 8ms, which is more on the order of a large phase error than a timing error. Bumping the ASIO buffer to 3-4ms would only add another 1-2ms to the discrepancy.
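 
Spelling that estimate out (a rough sketch; the individual figures are my approximations, not measurements):
 
# where the ~8 ms estimate comes from (all figures approximate)
output_ms = 3.0     # outbound latency (buffer + DAC), shared by soft synths and the hardware return
input_ms  = 3.0     # inbound latency (ADC + buffer) on the hardware synth's return path
midi_ms   = 5.0     # MIDI transmission plus the synth's own response time
hardware_path_ms  = midi_ms + input_ms + output_ms   # MIDI out -> synth -> back in -> monitor
softsynth_path_ms = output_ms                        # rendered in the box, only the output latency
print(f"discrepancy ~{hardware_path_ms - softsynth_path_ms:.0f} ms")   # ~8 ms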
 
If the timing is really audibly off, then something else is going on, like PDC. If you're using FX plugins that require PDC, you'll need to enable the PDC Override for live inputs so the hardware synths aren't affected. This is really the only thing I can think of that would account for differences between projects, too.
 
I loved the performance, flexibility and sound of the 1820m, but I had pretty significant problems with the beta x64 drivers.
2015/04/15 11:21:57
Dan_E10
tlw
I generally find things "just work", or very nearly. How far out is the audio compared to the MIDI?

I seem to need to use a timing offset of between 0 and 80 ms in the "Full Chase Lock" synchronization setting.  It varies by project and also by ASIO buffer size.  The minimum buffer I can choose is 2 ms but I need to use about 8 ms to avoid pops and crackles in the audio.
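
Side note, in case it's useful to anyone: one way to measure the offset, rather than dialing it in by ear, is to record the same short click from a soft synth and from the modular, then find the lag between the two clips. A rough sketch, assuming Python with numpy and soundfile installed (the file names are just placeholders):

import numpy as np
import soundfile as sf

# "vst_click.wav" and "modular_click.wav" are placeholder names for the same
# click exported from a soft-synth track and recorded from the modular
a, sr = sf.read("vst_click.wav")
b, _  = sf.read("modular_click.wav")
a = a if a.ndim == 1 else a[:, 0]       # take the left channel if stereo
b = b if b.ndim == 1 else b[:, 0]
n = min(len(a), len(b))
corr = np.correlate(b[:n], a[:n], mode="full")
lag = int(corr.argmax()) - (n - 1)      # positive = modular arrives late
print(f"offset: {lag} samples = {lag / sr * 1000:.1f} ms")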

tlw
Hardware synths take differing amounts of time to respond to MIDI, so I might need to slightly nudge audio to get spot-on agreement with the MIDI source track but it's a very few milliseconds at most and usually just a few samples.

I run at a 48 sample audio buffer and use Sonar's track echo function for monitoring by the way. That's around a 5ms round trip for the audio.

Things that might mean sync varies between projects include using plugins at the tracking stage that require plugin delay compensation. That's generally anything with a look-ahead function, and convolution reverbs. Another issue might be if you are daisy-chaining hardware using MIDI thrus: each thru delays the MIDI a little, and because MIDI is a serial protocol the data intended for each synth in the chain gets sent to one synth, then the next, and so on, which can also affect MIDI timing. Finally, no MIDI clock is perfectly stable, and like any MIDI application in Windows, Sonar's clock does wander a little.

The plugins used in one of my test projects were the Sonitus delay and BreVerb.  Perhaps things were not equal between the tracks used for the outboard synth and the soft synths.  I don't need to chain multiple midi thrus yet, but I will soon.  I'm sending midi out to a midi-CV converter, which is monophonic.  I plan to add a couple more channels with a second midi-CV converter, so I'm hoping that doesn't add more sync headaches.
thanks,
Dan
2015/04/15 11:26:27
Dan_E10
bitflipper
It sounds as though your problem might be that you're monitoring the synth through SONAR, where the lag between generating a MIDI note on your controller and the synthesizer responding is going to be variable due to different latencies in different projects.
 
If that's it, then the solution is to monitor your synthesizer in real time. Most audio interfaces feature a so-called "zero-latency" monitoring mode, wherein its analog input channels are routed directly to the interface's output or headphone outputs. Check your interface's documentation.
 
Thanks bitflipper.  I am monitoring through Sonar.  I'm kind of stuck doing this since I'm using delay and effects a lot at the composition stage.  A zero-latency monitoring mode would bypass all FX applied in Sonar, correct?  The midi tracks I'm working with right now have been entered via the Staff view and PRV, so any midi controller contribution to the sync shouldn't be a factor, I'm hoping.
Dan
2015/04/15 11:29:28
Dan_E10
lfm
EDIT: Forgot about the Time+ field on midi tracks. There you can adjust the timing so the audio for that track lines up. So much simpler in Sonar than in many hosts.
 
I wasn't even aware of this!  This may be a much easier option to use than the timing offset field I've been modifying in preferences.  I'll have to try it.  Thanks!
Dan