SONAR • MIDI "Jitter" - It Does Exist (p.20)
2007/10/12 17:08:21
brundlefly
ORIGINAL: Noel Borthwick [Cakewalk]

You should definitely try a VSTi as well since the mechanics are different.


I'll see what I can do. Thanks for looking at this in any case.
2007/10/12 17:10:18
dewdman42

ORIGINAL: brundlefly

the beginnings of transients should be at even intervals (i.e. all early or late by the same amount). But I am seeing differences on the order of 100 samples from the expected interval between consecutive events when rendering through the TTS-1.


Ok, I want to understand your test better. I didn't get a chance to try to replicate it last night; maybe over the weekend. But when I tried my simple test, I just looked to see if the MIDI events in the PRV were exactly lined up with the audio events, down to sub-ms precision. And they were. However, I did not actually check the number of samples between audio transients to see if the track view was lying to me. That sounds like what you are saying: that regardless of what you see in the track view, the transients that were frozen into the track were not evenly spaced as you would have expected. You're absolutely sure you're measuring from the exact start of each transient? 100 samples is more than 2ms, so not insignificant.

I will have to take a look at that this weekend, but I would also like to hear what Cakewalk has to say.


2007/10/12 17:13:07
dewdman42

ORIGINAL: Noel Borthwick [Cakewalk]
It could very well be a bug in the synth. I'll try and investigate your post when I get some time. You should definitely try a VSTi as well since the mechanics are different.


So the fact that this is a possibility leads me to believe that we are somewhat at the mercy of our plugins to know for sure they will not screw up the playback timing. Can you please elaborate on the details involved? I would have thought that all plugins are supposed to report a fixed latency, which Sonar can then compensate for. Are you saying that some plugins may be reporting more or less latency than they are actually producing, which is then causing timing discrepancies when Sonar applies delay compensation?
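For anyone following along, here's my rough mental model of how delay compensation is supposed to work - just a sketch of the general idea in Python, not a description of Sonar's actual engine, and the plugin names and latency figures below are made up:

# Rough sketch of plugin delay compensation (PDC) - NOT Sonar's actual
# code, just the general idea. The host asks each plugin how much latency
# it introduces and pre-delays every other path to match. If a plugin
# reports the wrong number, the host compensates by the wrong amount and
# the rendered audio lands early or late by exactly that error.

def align_tracks(tracks):
    # tracks: list of (name, reported_latency_in_samples)
    max_latency = max(latency for _, latency in tracks)
    # Delay each track so every path ends up with the same total latency.
    return [(name, max_latency - latency) for name, latency in tracks]

example = [("TTS-1 (DXi)", 0), ("Some VSTi", 256)]  # made-up latencies
for name, delay in align_tracks(example):
    print(name, "pre-delayed by", delay, "samples")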


2007/10/12 17:18:47
RTGraham

ORIGINAL: brundlefly

16. If the project is set up as described (and you run a 44.1k sample rate like I do), the nominal distance between two drum hits should be 1378 samples. If you get the same result I got, you will find many intervals of 1407 to 1414, with every 4th or 5th one being 1274. If you run a different sample rate, just scale the numbers accordingly.

1274 is 104 samples (about 2.4ms) short of what it should be, but because all the other intervals are a little too long, things get close to being back on track every 4th or 5th pulse. To me, this looks like a typical case of some sort of cyclic mathematical MODding. Could be a programming error in the rendering algorithm of the TTS-1, or somewhere else in Sonar. I don't know. I haven't yet tried this with another soft synth.



Very, very interesting.

Did you happen to make a timing list of the sample intervals, and compare from beat to beat, and from measure to measure, to see if there was a consistent pattern?
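(If it helps, a timing list like that is trivial to generate once you have the transient positions in samples - a quick Python sketch, where the marker positions are invented purely for illustration and are not your actual measurements:)

# Build a timing list from transient positions (in samples) and compare
# each interval to the nominal spacing. Substitute the AudioSnap marker
# offsets actually read out of the track; these are example numbers only.

NOMINAL = 1378  # expected samples between hits for the project setup described above

markers = [0, 1410, 2818, 4226, 5500, 6910]  # invented example positions

intervals = [b - a for a, b in zip(markers, markers[1:])]
for i, iv in enumerate(intervals, 1):
    print("hit %d: interval %d samples, deviation %+d" % (i, iv, iv - NOMINAL))
print("cumulative drift:", markers[-1] - NOMINAL * len(intervals), "samples")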

I became aware, from the test I did a couple of years ago, of hardware sequencers and drum machines that, while they had no MIDI "jitter," did not have truly equal sixteenth notes. There was a definite pre-programmed "feel" to the sixteenth-note quantization, and a pattern of larger and smaller sample intervals between consecutive sixteenth notes, and what you're describing sounds very similar. I wonder how many drum machine manufacturers do this without advertising it - and then certain music-producing communities (e.g. hip-hop) swear on their mother's macaroni and cheese that a specific drum machine / sampler / sequencer (e.g. the MPC3000) just *feels* better than anything else.

Of course it does - it's been *groove* quantizing, not absolute quantizing.

(I should note that the MPC3000 was *not* in fact one of the units I tested - I only mention it here because it's just a good illustration of a hypothetical scenario.)

Back to the matter at hand - when I had run these tests before, I did not find SONAR doing the same "covert groove-quantizing," but given the relationship with Roland (including the origin of the TTS-1), I wonder if there's any possibility that what you're seeing is the same thing.

Or, as Noel suggests, it could just be a bug in the softsynth.
2007/10/12 17:32:56
RTGraham

ORIGINAL: dewdman42

My feeling is that playback through VSTi's should absolutely be intertwined with audio sync and should be 100% jitter free. If it is not, as some people seem to be claiming, then Cakewalk has a lot of work to do and I'm not impressed. I haven't noticed any problems here, but some people appear to have.

...

There is no excuse for this when playing through soft instruments, since everything is being rendered by the Sonar audio engine, which is supposed to know all about audio buffers, delay compensation, etc. There is absolutely no reason for the MIDI timing in that case not to be 100% perfectly tight.



Agreed. I'm glad to see that Noel chimed in on this thread - perhaps, as he suggests, there's some kind of softsynth bug coming into play. When I freeze softsynths from quantized MIDI tracks, I don't see any problem with timing. The only exception I've seen is with softsynths that have known issues with Fast Bounce - if I Fast Bounce BFD or Akoustik, I end up with all kinds of strange audio artifacts, notes in the wrong measures, etc. If I disable Fast Bounce the problems go away. Fairly frustrating, especially when freezing 15 tracks of drums on a 5-minute song, but that's not the issue at hand here.

ORIGINAL: dewdman42

The problem that comes up related to MIDI drivers and recording from a MIDI keyboard is probably not solvable - we have spent a great deal of time discussing it in this thread and sharing some interesting facts, but it's a limitation of Windows. At any rate, my perception is that it's not actually that bad, and that most of the complaints people have concern timing differences much, much greater than 1-2ms: large and blatantly obvious timing glitches at playback time.



Yes, most of the complaints seem to pop up when there are large and blatantly obvious timing glitches. But as we're seeing in this thread, as people become more informed about the issue, they recall situations in their own experience (e.g. "Guess what - my timing is actually OK when I just record audio instead of MIDI") where the MIDI "jitter" issue has affected them. I think that now that audio stability has reached a certain threshold, it's reasonable to expect that enough people will want their MIDI to be stable as well for it to become both a solvable issue and something that manufacturers are interested in accomplishing.

ORIGINAL: dewdman42

But short of just bringing this up all the time, I'm not sure Cakewalk will take action, it doesn't seem to be something that people complain about very often.



I don't know that it's something Cakewalk can take action on by themselves; it's deeper than just the sequencer. It seems like something that has to happen at the manufacturer level (Jim Wright has suggested that a MIDI interface could be built that references its own, more stable clock - and could also clock to an external stable clock, like word clock - achieving much greater internal MIDI stability while still being able to work with Windows' current driver APIs), and which, once accomplished, would benefit not only SONAR users but *all* Windows DAW users. I don't know whether this carries over to the Mac platform; I've never done similar testing on the Mac side, so I don't know how much MIDI "jitter" affects Mac MIDI communications.

This also is the first thread I have seen on this issue on this forum in a *long* time - perhaps ever - that has gotten this involved, with this much interest and attention, and especially with this many good ideas and potential solutions being discussed.
2007/10/12 17:35:06
dewdman42
I would not expect Sonar to be doing irregular quantizing on purpose. Quantizing is quantizing, unless it is specifically groove quantizing or humanizing, which is a different thing entirely. I think for this test we can assume the MIDI data is quantized, and we would expect the intervals to be exactly the same.

The same test should be run through a variety of DXi and VSTi plugins to see what kind of results we get.
2007/10/12 17:36:12
RTGraham

ORIGINAL: pianodano


ORIGINAL: Jim Wright
<snip>

I have some ideas about how to build a MIDI interface that uses robust timestamps synchronized with an audio sample clock, but some custom chip design would likely be required. If someone with chip-design chops would like to collaborate on an open-source hardware/software project to do just that -- let me know.

- Jim


I can hardly believe what I have read here: that MIDI is not synchronized to the audio via the sample clock. How did that idea become the standard? What is the SOURCE of the MTC that is generated and output?

Danny


Keep in mind that when MIDI was first implemented, manufacturers were not thinking specifically about synchronizing it with audio. The first MIDI sequencers, and then the first computer MIDI sequencers, were implemented just to play back synths and drum machines. Audio was added much later, after "satisfactory" methods of MIDI clocking had already become the norm. It happens that audio required significantly more stable clocking solutions - it's much more audibly obvious when there's an audio clock problem - and so by comparison now, we can see how far behind the curve MIDI clocking is in terms of accuracy and stability.
2007/10/12 17:39:53
RTGraham

ORIGINAL: Jim Wright


BTW - I have some ideas about how to build a MIDI interface that uses robust timestamps synchronized with an audio sample clock, but some custom chip design would likely be required. If someone with chip-design chops would like to collaborate on an open-source hardware/software project to do just that -- let me know.

- Jim


Wow.

I don't have that kind of design chops, nor do I know anyone personally who does; but as I communicate with people in the industry, I'll certainly keep my ears open and mention it if it seems like the issue is "discuss-able." It would be great to see a new standard of MIDI interfacing; since we've seen that audio can achieve that kind of stability, it should be possible with the MIDI data stream as well.
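Just to make the idea concrete for anyone who hasn't thought about it in these terms: if the interface stamped each incoming MIDI byte against the audio device's sample counter, the host could place the event at an exact offset inside the buffer being rendered, instead of guessing from a millisecond system timer. A toy Python illustration - purely conceptual, not anyone's actual driver API:

# Toy illustration of sample-clock timestamping - conceptual only, not a
# real driver API. An event stamped against the audio sample counter can
# be placed at an exact sample offset inside the buffer being rendered,
# rather than being rounded to the nearest timer tick or buffer boundary.

def place_event(event_sample_time, buffer_start, buffer_len):
    # Return the event's offset within this buffer, or None if it belongs
    # to an earlier or later buffer.
    offset = event_sample_time - buffer_start
    return offset if 0 <= offset < buffer_len else None

# e.g. an event stamped at sample 441344, rendered in a 512-sample buffer
# starting at sample 441088, lands exactly 256 samples into that buffer:
print(place_event(441344, 441088, 512))  # -> 256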
2007/10/12 17:50:13
brundlefly
ORIGINAL: Noel Borthwick [Cakewalk]

You should definitely try a VSTi as well since the mechanics are different.


Indeed it seems they are. The only VSTi I have is TruePianos. I thought it might give me trouble finding transients given the long decays of piano notes, but it turned out not to be a problem; AudioSnap found them all, and marked them all in exactly the same place. That place was a little way into the visual start of the transient, 300 samples behind the MIDI event, but... Drum roll please...

Every interval was identical, and equal to the expected value (2940 samples for 64th notes at 56.25 BPM)!

It remains to be seen whether TruePianos' superior performance is inherent to VSTi technology or just that they've got a better timing algorithm. If someone can turn me on to a freeware, downloadable VSTi somewhere, even a short-term demo like the TruePianos installation I'm using, I'll test it.
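For anyone who wants to sanity-check the expected interval at another tempo or sample rate, the arithmetic is just this (nothing Sonar-specific, shown here as a couple of lines of Python):

# Expected samples between evenly spaced notes:
# (seconds per quarter note) / (notes per quarter note) * sample rate

def samples_per_note(bpm, notes_per_quarter, sample_rate=44100):
    return 60.0 / bpm / notes_per_quarter * sample_rate

print(samples_per_note(56.25, 16))  # 64th notes at 56.25 BPM -> 2940.0
print(samples_per_note(120, 16))    # 64th notes at 120 BPM -> 1378.125 (close to the 1378 figure earlier in the thread)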
2007/10/12 18:11:46
RTGraham

ORIGINAL: brundlefly
Every interval was identical, and equal to the expected value (2940 samples for 64th notes at 56.25BPM)!


Cool beans... so we're seeing that at least under certain circumstances, rendered softsynth MIDI data is stable as expected.