• SONAR
  • MIDI "Jitter" - It Does Exist (p.34)
2007/10/19 15:59:15
jb

ORIGINAL: Steve_Karl

http://www.sightsea.com/music/singles/safety's_beginning.mp3


Steve, nice music.
2007/10/19 16:01:55
RTGraham

ORIGINAL: Steve_Karl
In reality, my experience is that 960 ticks in Sonar is much more accurate to a human performance than 96 is.
An MC-500 Mark II would be intolerable for me at this stage of the work I do.

Jitter is a non-issue for me. I don't see it, I don't hear it. My rig is stable and the playback is reliably consistent.
I'm a happy camper at 960.
Now ... It's quite possible that the "slop" caused by jitter that you refer to is something I've gotten used to, and have learned to work with.
Kind of like the way we learn to work with the difference in the height of guitar strings, or the throw of a piano key.
Listen and adjust, and forget about it.


Good point. And I should probably note here that for the most part, I've been very happy with 960ppqn as well - to me, it definitely feels like an improvement over older, lower-ppqn software sequencers. Most of the time, I'm happy with how SONAR plays back what I record... but I do occasionally find myself tweaking in surprising spots (especially with things like cymbal rolls).

ORIGINAL: Steve_Karl
I make a point of keeping my tempos set high, as in, if the piece is really at 90 BPM, I'll intentionally run it at 180.
I usually don't worry about the tempo unless I'm getting down below 100 BPM.


Interesting. I know dewdman says this only makes your computer work harder, but I wonder if you might be effectively running at 1920ppqn (kind of like optical resolution versus digital resolution on a scanner). In theory, you're achieving a better, more accurate *feel* by recording with higher precision when the timestamps are accurate, but it probably also means that the "slop," which is millisecond-timer-related (as opposed to MIDI sequencer clock related), spans twice the number of clicks when it does rear its head.
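The tick arithmetic behind this can be sketched quickly. A minimal Python sketch, using hypothetical tempo and slop numbers purely for illustration:

```python
# Sketch: how long one sequencer tick lasts at a given tempo and PPQN,
# and how many ticks a fixed amount of millisecond-timer "slop" spans.

def ms_per_tick(bpm: float, ppqn: int) -> float:
    """Duration of one tick in milliseconds: 60,000 ms per minute / beats / ticks."""
    return 60_000.0 / bpm / ppqn

def slop_in_ticks(slop_ms: float, bpm: float, ppqn: int) -> float:
    """How many ticks a given timing error covers at this tempo/resolution."""
    return slop_ms / ms_per_tick(bpm, ppqn)

# A piece that is "really" 90 BPM, run at 90 vs. 180 BPM at 960 PPQN:
print(slop_in_ticks(2.0, 90, 960))    # 2 ms of slop = 2.88 ticks at the real tempo
print(slop_in_ticks(2.0, 180, 960))   # the same 2 ms = 5.76 ticks at double tempo
```

Finer effective resolution captures the performance in more detail, but the same millisecond-level error covers twice as many ticks, exactly as described above.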

Interesting, musical post.
2007/10/19 16:04:08
dewdman42

ORIGINAL: RTGraham
but it probably also means that the "slop," which is millisecond-timer-related (as opposed to MIDI sequencer clock related), spans twice the number of clicks when it does rear its head.


exactly
2007/10/19 16:10:59
RTGraham
ORIGINAL: dewdman42


ORIGINAL: RTGraham
but it probably also means that the "slop," which is millisecond-timer-related (as opposed to MIDI sequencer clock related), spans twice the number of clicks when it does rear its head.


exactly


Right... however, I can see how doubling the tempo might still be useful to Steve... if he happens to be using an extremely stable MIDI interface, where he's already minimizing his jitter, then he'll be capturing the subtleties of his performance with finer resolution overall, jitter notwithstanding. Wonder what interface he's using.

Which also makes me wonder... I know a lot of people who have old Emagic Unitor-8 interfaces that they never gave up, even though they're out of production, because they feel they're extremely stable. I'm talking about musicians who are serious about their timing... I wonder what kind of jitter characteristics those interfaces have. Granted, some of them are being used in Mac environments, so the entire communication mechanism might be different.

EDIT:

Just did some searching on the Unitor-8 interfaces... Emagic had incorporated, at the time, a new "technology" called AMT. It sounds, essentially, like hardware timestamps that get decoded only by the same manufacturer's software - in this case, Logic. Being that Apple now owns Logic, one could hypothesize that they incorporated some of those concepts into CoreMidi, which would make the Unitor a particularly stable interface on the Mac now, regardless of what application (e.g. Pro Tools) accesses it.

An interesting scenario, if it's true. And an argument for Microsoft to do something similar with Windows: based on all of the posts in this thread, it's becoming clear (at least to me) that one solution, and probably a good one, would be for Windows to incorporate a kernel-level MIDI API that would support better clock resolution and timestamping.
2007/10/19 16:19:35
jb
It's discussed here a couple of mouse wheels down.
2007/10/19 16:24:10
dewdman42
The emagic unitor provided hardware timestamping, but only with Logic. It will not be more stable than anything else under sonar.

If you have 2-4ms of slop when recording your midi parts, then it does not matter what the resolution is set to; it only means the slop will be recorded more precisely... more precisely recorded slop! heh heh. It's futile to think it's giving any more accurate precision than that, and there is the possibility that the increased workload on the computer from handling more ticks per second will make the slop worse.

Some key strikes might get lucky, some may not. You would have no way of knowing which ones were on the money and which were slopped out. Everyone has been measuring, and the best we're seeing averages around 2ms off from reality.

As I said, if you want to capture a gliss or grace note, try the higher resolution; maybe you will get lucky, maybe not. If you're just trying to play to tempo, using a lower PPQN is more likely to get the note where you wanted it and will consume fewer system resources doing so.
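The "coarser grid absorbs slop" idea can be illustrated with a small sketch. A jittered timestamp gets snapped to the nearest tick boundary; at a low PPQN the grid spacing exceeds the error, so the note lands back where it was intended. Hypothetical numbers, for illustration only:

```python
# Sketch: quantizing a jittered timestamp to the nearest tick at two resolutions.

def snap_to_tick(time_ms: float, bpm: float, ppqn: int) -> float:
    """Snap a timestamp (ms) to the nearest tick boundary, returned in ms."""
    tick_ms = 60_000.0 / bpm / ppqn
    return round(time_ms / tick_ms) * tick_ms

intended = 500.0           # a note meant to land exactly on a beat at 120 BPM
played = intended + 2.0    # 2 ms of timer slop on the recorded timestamp

print(snap_to_tick(played, 120, 96))    # coarse grid (~5.2 ms/tick): back to ~500.0
print(snap_to_tick(played, 120, 960))   # fine grid (~0.52 ms/tick): ~502.1, slop kept
```

So the coarse grid records the note "where you wanted it," while the fine grid faithfully records the 2 ms of slop - which is the trade-off described above.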



2007/10/19 16:27:11
Steve_Karl

ORIGINAL: RTGraham

Right... however, I can see how doubling the tempo might still be useful to Steve... if he happens to be using an extremely stable MIDI interface, where he's already minimizing his jitter, then he'll be capturing the subtleties of his performance with finer resolution overall, jitter notwithstanding. Wonder what interface he's using.



Originally I had a Voyetra V24 (I believe it was called) on the first machine I used CWPA9 on.
It was very good.

Now I have 2 TASCAM PCI-822 cards (PCI slot... in 2 different machines).
No complaints at all.

2007/10/19 16:33:00
dewdman42

ORIGINAL: RTGraham
Just did some searching on the Unitor-8 interfaces... Emagic had incorporated, at the time, a new "technology" called AMT. It sounds, essentially, like hardware timestamps that get decoded only by the same manufacturer's software - in this case, Logic. Being that Apple now owns Logic, one could hypothesize that they incorporated some of those concepts into CoreMidi, which would make the Unitor a particularly stable interface on the Mac now, regardless of what application (e.g. Pro Tools) accesses it.

Apple licensed MOTU's MTS technology for CoreMidi. In theory, MOTU midi devices with MTS will provide hardware timestamping to all CoreMidi applications, including Logic and Digital Performer. Prior to CoreMidi, only Digital Performer could take advantage of this hardware timestamping feature.

On the Windows side, the only options have been the Unitor with AMT, and Steinberg also had a solution that only worked with their sequencer.


An interesting scenario, if it's true. And an argument for Microsoft to do something similar with Windows: based on all of the posts in this thread, it's becoming clear (at least to me) that one solution, and probably a good one, would be for Windows to incorporate a kernel-level MIDI API that would support better clock resolution and timestamping.


Now you're getting it. This is what I've been trying to say for days. Microsoft is the one to bug about this. They need a Windows equivalent of Apple's CoreMidi... and then you have to convince hardware makers to build the same sort of technology into their interfaces that MOTU did in theirs. Ideally, Microsoft would mimic Apple and license MOTU's MTS. That way, all the MOTU hardware would instantly become usable under Windows sequencers that use the new model. However, I doubt MS will ever copy Apple in this regard. Also, MS has had DirectMusic out for a long time now, which had a lot of technology similar to CoreMidi, and absolutely nobody in the Windows world jumped on that bandwagon except for Steinberg and that guy with the 8PortSE driver. There is just not a lot of drive in this direction.

Personally I think Cakewalk has a lot of influence over Microsoft audio/midi technology and I wish they would take a more active role in terms of advancing sub-millisecond midi timing accuracy.

2007/10/19 16:39:10
dewdman42

ORIGINAL: Steve_Karl
Now I have 2 TASCAM pci-822 cards ( pci slot ... in 2 different machines )
No complaints at all.


That is one of the few PCI-based midi interfaces out there, and I would be very interested to hear the results if you were to run this interface through some of the tests that others on this thread have done. PCI-based midi does have the potential for lower latency and less jitter than USB in general, though you're still dealing with the MM timer, which is 1ms on a good day but can sometimes be sloppy too. Still, I would expect that you might very well be getting pretty close to 1ms timing with that interface, just on a hunch (which is the same as one tick at 480PPQN, by the way)... not including the other midi slop you get from the midi cable and midi ports on your keyboard.

2007/10/19 17:10:13
brundlefly
ORIGINAL: dewdman42
If you're just trying to play to tempo, using a lower PPQN is more likely to get the note where you wanted it and will consume fewer system resources doing so.


Okay. I'm officially taking a stance: This is NOT true.

I set up a new Sonar project at 48PPQ, saved it as a template, closed Sonar, restarted, opened the template, and recorded some randomly played notes from my keyboard. In the event view, the first four events were at ticks 37, 3, 19 and 10. This would correspond to 740, 60, 380 and 200 at 960PPQ. But when I changed the project to 960PPQ, the displayed tick values were 742, 73, 392 and 201.

Based on this, I feel it is safe to say that Sonar always uses a resolution of at least 960PPQ internally, so there is no value in running at a lower PPQ setting, except possibly if you like the tick values to have a smaller range when you are looking at things in the event view.
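The conversion in that test is straightforward to reproduce. A sketch of the same check, using the tick values reported above: if SONAR only stored 48PPQ precision, the values re-displayed at 960PPQ would be exact multiples of 20, and the observed values are not.

```python
# Sketch: scale tick values recorded at 48 PPQ up to 960 PPQ and compare
# against what SONAR actually displayed after changing the project PPQ.

recorded_48 = [37, 3, 19, 10]        # event-view ticks in the 48 PPQ project
observed_960 = [742, 73, 392, 201]   # ticks shown after switching to 960 PPQ

scale = 960 // 48                    # 20 ticks of 960 PPQ per tick of 48 PPQ
scaled = [t * scale for t in recorded_48]
print(scaled)                        # [740, 60, 380, 200] - exact 48 PPQ positions

# Nonzero remainders mean the events carry finer-than-48-PPQ timing internally:
print([o - s for o, s in zip(observed_960, scaled)])   # [2, 13, 12, 1]
```

The nonzero remainders are the evidence for the conclusion above: the stored timestamps have more precision than the 48PPQ display implied.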

