pianodano
Max Output Level: -67 dBFS
- Total Posts : 1160
- Joined: 2004/01/11 18:54:38
- Location: Va Beach Virginia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 18:01:13
(permalink)
ORIGINAL: Jim Wright <snip> I have some ideas about how to build a MIDI interface that uses robust timestamps synchronized with an audio sample clock, but some custom chip design would likely be required. If someone with chip-design chops would like to collaborate on an open-source hardware/software project to do just that -- let me know. - Jim

I can hardly believe what I have read here: that MIDI is not synchronized to the audio via the sample clock. How did that idea become the standard? And what is the SOURCE of the MTC that is generated and output? Danny
Best, Danny Core I7, win XP pro, 3 gig ram, 3 drives- Lynx Aurora firewire- Roll around 27 inch monitor, 42 inch console monitor- Motif xs controller - Networked P4's and FX Teleport for samples- Muse Receptor VIA Uniwire for samples and plugs- UAD QUAD Neve - UAD 1- Sonar X1 but favor 8.5 GUI - Toft ATB 32 - Vintage hardware - Tascam MS-16 synched via Timeline Microlynx -Toft ATB32 console
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 18:46:08
(permalink)
Pianodano: MTC is the MIDI equivalent of SMPTE. Research those to find out more. Not related to audio at all.

Jim, I don't know anything about hardware or chips, but this is a great idea and sorely needed. If you can get anyone to design the hardware, let me know and I can help with the drivers. I personally think that if someone made an affordable MIDI interface with timestamping in the hardware and all the correct synchronization built in to work with audio, MIDI and video setups...and all DAWs universally...well, that would become the standard that everyone would follow. They should have done it like that to begin with.

I believe some of the MOTU stuff does some of what you are talking about in terms of sample-accurate sync...but I think it might only be their high-end video-related stuff like the Digital Timepiece that does it. I just checked the back of my MOTU MIDI Timepiece. Admittedly, this is the older one; perhaps the new USB one has more stuff, but all I see on the back is a video sync in and a word clock out. I will have to read the manual to find out how that is designed to work. Word clock is really more like a sample-rate metronome...so I'm pretty sure the word clock out is just there to make sure that my DAW is sample accurate.

Anyway, back to Sonar... This weekend I am going to try to run some tests where I freeze tracks to audio and compare the MIDI notes to the waves I see on the audio track, to see if there is any jitter happening during freeze or mixdown. After that I will try to capture the audio that is playing when I am just hitting play and playing MIDI tracks through soft synths, then compare that audio track to the MIDI track, look at the waves, and try to determine if there is jitter happening. Those two situations are what I am most concerned about right now. If Sonar is screwing that up, then Cakewalk needs to get busy....
post edited by dewdman42 - 2007/10/11 20:24:51
|
Tonmann
Max Output Level: -77 dBFS
- Total Posts : 661
- Joined: 2005/07/27 06:59:22
- Location: Kiel, Germany
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 19:11:27
(permalink)
From my technical knowledge, one can't have sample-accurate sync between audio and MIDI in a normal PC system, because the devices connected to the PCI bus work asynchronously, as do the buffers used for audio playback. So if one really wants sample-accurate sync, one has to omit buffers completely, which is impossible in a bus-organized structure like PCs have. Of course, it could be done if the MIDI outs were on the same interface as the audio ins/outs, but this would require special (additional) hardware, as Jim Wright already wrote. Also, one would have to develop a completely new driver structure, because MIDI has no "handshake" lines or other low-level control mechanisms. So this would also mean establishing a new standard for that... Not to mention the hassle with the different OSes. cheers, Chris
...maybe I never realized the joy till the joy was gone...
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 20:10:15
(permalink)
I don't think sample-accurate sync is really required at this level. Not even close. They are separate things, running at completely different rates of resolution. Audio is typically 44,100 samples/sec for most of us; MIDI sequencing requires at most a few thousand ticks per second, and more likely it's in the hundreds. If you look at Sonar, I don't think you will find any capability to have a MIDI event start at a certain sample count from the start of the track. You can only tell a MIDI event to happen at a certain BAR:BEAT:TICK, or you can tell it to start at a SMPTE time, which is a measly 30 frames a second and usually lower resolution than ticks. And the BAR:BEAT:TICK resolution will take precedence, whatever it is. There is absolutely no notion that the MIDI event will start at a particular sample number, that I am aware of.

There is only a need to (A) make sure the audio sample rate is stable, (B) make sure the MIDI tick rate is stable, and (C) make sure their timebases are synchronized pretty closely, which does not need to be sample accurate either. By timebase, I mean having a notion of where zero is on the timeline for both streams, so that when you rewind and hit play, they both start playing with stable clocks from the same point in time. Or if you fast forward to a particular point, you can calculate how many samples forward that is and how many MIDI ticks forward, and then start both clocks at the same time; again, both clocks run at completely different rates, but nonetheless as accurately as possible, located to the same starting place, and playing forward accurately without drift.

(A) The sample rate is kept stable with word clock. By the way, audio sampling can experience its own form of jitter too; the thing is, it's happening so fast you don't even realize it. Better-quality A/D converters have more stable clocks in them, which is one of many reasons they sound better. You can actually make a mediocre soundcard sound better by feeding it word clock from a more stable word clock master source such as the Apogee Big Ben. But even without word clock, most soundcards have their own internal clock that does a pretty reasonable job of sampling the audio at regular intervals, 44,100 times a second, with low enough audio jitter to satisfy most of us.

(B) The MIDI tick clock rate is more difficult today, since with our DAWs most of us are letting WinXP be the clock, and it's not very accurate. Even at a measly 1000 ticks a second, WinXP can't do it very well. Most MIDI interfaces do not have an internal clock or the ability to timestamp events against their own internal clock. Only a few do.

(C) Once playback starts, no sample-by-sample synchronization can or should occur; both streams just need to run with stable clocks. You could, I suppose, have the sample clock trigger the MIDI clock in some way, either by having them in the same device or by having a MIDI device that can listen to word clock...or perhaps ADAT sync? Dunno, but I don't think anything does it today. I'm not actually sure if the MOTUs with timestamping can listen to word clock, but it's kind of overkill to try to force the MIDI to be so tightly locked to the sample clock, overkill which requires processing resources. Nah, all that is really needed is for the MIDI interface to have its own very stable clock and to timestamp each event with some kind of timestamp. The problem is how to get a timebase into the MIDI interface so that the timestamp can then be interpreted by the MIDI driver later to mean something useful. I guess with ADAT sync you could get a sample-accurate timestamp, but the driver would downgrade it to BAR:BEAT:TICK resolution in Sonar anyway. Word clock could only be useful to establish a stable clock.

So I suppose you could have a MIDI interface that timestamps everything at the same clock rate as the sample rate, and then divide it back down in the MIDI driver. However, again, how do you get the timebase in there to establish the location in time? ADAT sync is the only way I know of. Anyway, we're waxing theoretical again, not discussing anything that Cakewalk has any control over. The stuff I highlighted in blue above is what I think we should be focusing on.
post edited by dewdman42 - 2007/10/11 20:28:59
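To put rough numbers on the resolution gap discussed above, here is a small Python sketch (an editorial illustration only, not anything Sonar actually exposes) converting between MIDI ticks and audio samples given tempo, PPQ, and sample rate:

```python
def ticks_to_samples(ticks, bpm, ppq, sample_rate=44100):
    """One MIDI tick lasts 60 / (bpm * ppq) seconds."""
    return ticks * 60.0 / (bpm * ppq) * sample_rate

def samples_to_ticks(samples, bpm, ppq, sample_rate=44100):
    """Inverse of ticks_to_samples."""
    return samples / sample_rate * bpm * ppq / 60.0

# At 120 BPM / 960 PPQ, one tick spans about 23 samples at 44.1 kHz,
# so the MIDI grid is far coarser than the sample clock:
print(ticks_to_samples(1, 120, 960))      # 22.96875
print(samples_to_ticks(22050, 120, 960))  # 960.0 (one quarter note)
```

The numbers assume 120 BPM and 44.1 kHz purely for illustration; the point is that even at 960 PPQ the MIDI timeline quantizes far more coarsely than the audio one.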
|
pianodano
Max Output Level: -67 dBFS
- Total Posts : 1160
- Joined: 2004/01/11 18:54:38
- Location: Va Beach Virginia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 20:15:48
(permalink)
ORIGINAL: dewdman42 Pianodano: MTC is the MIDI equivalent of SMPTE. Research those to find out more. Not related to audio at all.

Yep, I know that. But what is the clock source of MTC in Sonar? Or what is the timing reference? Based on what has already been stated, I can't see a common source with the audio. IOW, if someone intends to synchronize something to something else, it just makes sense that the something else would be the reference. Anyone care to clarify?
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 20:25:37
(permalink)
Most likely the clock source for MIDI and the SMPTE counter (which is even lower resolution than MIDI most of the time) in Sonar is the MM timer, part of Windows XP. The sample clock is not really usable for MIDI/SMPTE counters, because from Sonar's perspective it does not get a smoothly running sample clock timer. See, what happens is that the soundcard has a smoothly running sample clock inside its own hardware that is filling a buffer. When the buffer fills up, Sonar is told to come fetch the data as a chunk from the buffer. You control the buffer size in your ASIO control panel. As you know, if you set the buffer size really low, the CPU starts to puke. There is no CPU today that could keep up with Sonar trying to react to each sample individually and use that clock to also pay attention to MIDI tasks. See what I mean?
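A quick way to see the buffer-chunk point in numbers (an editorial sketch, not Sonar internals): the host only observes the soundcard clock once per buffer callback, so its effective timing granularity is the buffer duration.

```python
def callback_granularity_ms(buffer_size, sample_rate=44100):
    """Time between buffer callbacks: the host's view of the
    soundcard's sample clock advances only in steps this large."""
    return buffer_size / sample_rate * 1000.0

for size in (64, 256, 1024):
    print(size, round(callback_granularity_ms(size), 2), "ms")
# 64 -> ~1.45 ms, 256 -> ~5.8 ms, 1024 -> ~23.22 ms
```

The 44.1 kHz rate is assumed for illustration; lower buffer sizes give finer host-side timing but, as noted above, cost CPU.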
|
pianodano
Max Output Level: -67 dBFS
- Total Posts : 1160
- Joined: 2004/01/11 18:54:38
- Location: Va Beach Virginia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 20:52:54
(permalink)
ORIGINAL: Tonmann <snip> Not to mention the hassle with the different OSes. cheers, Chris

I wonder if I am the only one getting the feeling that the existing underlying engineering/architecture is a piecemeal affair? And is there possibly a correlation with why MIDI has not been at the forefront of development for years around here... until now... like maybe they were hoping it would go away?
post edited by pianodano - 2007/10/11 21:03:35
|
pianodano
Max Output Level: -67 dBFS
- Total Posts : 1160
- Joined: 2004/01/11 18:54:38
- Location: Va Beach Virginia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 21:12:47
(permalink)
ORIGINAL: dewdman42 <snip> This weekend I am going to try to run some tests where I freeze tracks to audio and compare the MIDI notes to the waves I see on the audio track to see if there is any jitter happening during freeze or mixdown. <snip>
Dewdman42, I know you said in a later post that the stuff in blue should be the focus, which is cool, but can we really expect the frozen audio to precisely correlate to the MIDI trigger events if there is no exact synchronization to the audio? (Forgive me, but I am playing devil's advocate.) Furthermore, how could they? What would happen if you inserted another copy of the same synth, output the same MIDI trigger track to that synth, froze, and compared the two frozen audio tracks? I think I already know the answer, but I'd be interested in your best guess.
post edited by pianodano - 2007/10/11 21:24:38
|
Blades
Max Output Level: -43 dBFS
- Total Posts : 3246
- Joined: 2003/11/06 08:22:52
- Location: Georgia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 21:51:56
(permalink)
I will need to double-check this, or maybe someone else can: I recorded some drums with the resolution set to 240. I typically notice, when at 960, that my playing is a little behind the beat, by about 15 ticks or so, which usually seems like it's late, though my latest test seems to show that it IS recording what I'm playing, so maybe I'm just a little late. I don't know what would be considered "good" from a drumming perspective, but I am a pretty decent click player. At any rate, I recorded at 240 and my stuff was in the 5-ticks-off area, which is what I'd expect; mathematically, it is proportionally as far off as it was before. So for fun, I bumped the project up to 960. I went into the piano roll, expecting as mentioned above to find it to be about 20, and it showed as "about 5 ticks"...did I just get a tighter recording, because I'm only 5 ticks off at 960 now as well?

Another thought: how do things like pressure, controllers, etc. factor into these resolutions? Do you need the higher res to capture the continuous action of things like hi-hat pedals and smooth filter sweeps? I just don't feel like doing the math! Again...I will have to check to see that I REALLY went up to 960 and that it can then record at that rate, and that the 5 ticks REALLY is 5 ticks, not just showing that way or something.
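The proportionality question above can be sanity-checked with a couple of lines of Python (editorial sketch; the millisecond figure assumes 120 BPM, which the post does not state): a tick offset scales linearly with PPQ, so 5 ticks at 240 PPQ is the same musical distance as 20 ticks at 960 PPQ.

```python
def rescale_ticks(ticks, from_ppq, to_ppq):
    """Express the same musical offset at a different tick resolution."""
    return ticks * to_ppq / from_ppq

def ticks_to_ms(ticks, bpm, ppq):
    """Duration of a tick offset in milliseconds at a given tempo."""
    return ticks * 60000.0 / (bpm * ppq)

print(rescale_ticks(5, 240, 960))  # 20.0 -- what 5 ticks at 240 should read at 960
print(ticks_to_ms(15, 120, 960))   # 7.8125 ms -- the "15 ticks behind" at 120 BPM
```

So if a 5-tick offset recorded at 240 PPQ still reads as 5 ticks after bumping the project to 960 PPQ, the recording really did get tighter, rather than the display rounding.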
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 22:08:24
(permalink)
(Sigh.) This stuff is not simple... Well, without getting into another long-winded explanation, let's just say: during mixdown and freezing, it's not realtime. Sonar should have all the time in the world to calculate every single MIDI note, to make sure it goes through whatever plugin and produces WAV output that is exactly where it needs to be in the timeline. All of that can be calculated in non-realtime with certainty, and there is no reason there should be any jitter at all in the WAV output from Sonar when MIDI tracks are being rendered through soft instruments, totally inside Sonar.

Remember, plugins are not guitar pedals. They don't receive an audio signal from Sonar. They are pieces of software that churn numbers and produce more numbers. They are handed buffers of data in non-realtime and spit out more buffers of data in non-realtime, which Sonar then assembles into a final WAV buffer that eventually goes to your soundcard and sounds like realtime music. How all of those buffers are assembled into the final output buffer depends on the calculated timestamps that are attached to the buffers as all this ahead-of-time calculating is happening. There is no reason at all for MIDI events to be calculated into the wrong timestamp. If so, that's a bug. In the background, Sonar should be able to calculate every single audio and MIDI event as having a certain moment in time when it should occur in the final WAV output, down to the exact sample.

Now when you are talking about the actual hardware (hardware for audio and hardware for MIDI), they are separate, with their own clocks that run in realtime and are not in sync. If Sonar has to send a signal to the MIDI interface so that it will send a note to an external synth, well, that can't really be calculated ahead of time very well. Some timer needs to kick off at the moment the note needs to be heard, so that the note can be sent to the MIDI interface in realtime. That's where the timing issues are probably not going to get any better anytime soon, since improvement depends on better hardware integration related to timestamping.

Everything I just described above is, for lack of a better word, like virtual synchronization. The audio and MIDI streams are not really being synchronized perfectly; rather, in the background, the MIDI and audio data are being mixed together into WAV output, and as such every single event can have its location calculated exactly and be placed in the final WAV output buffer where it's supposed to be. But that is not the same as realtime sync.
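The "virtual synchronization" idea can be sketched in a line of Python (an editorial illustration, not Cakewalk code): in an offline render, each event's absolute sample position reduces to a render-buffer index plus an offset inside that buffer, so no real-time clock is involved at all.

```python
def locate_event(event_sample, buffer_size):
    """Map an absolute sample position to (render-buffer index,
    offset within that buffer) for offline rendering."""
    return divmod(event_sample, buffer_size)

# An event at sample 100000 with 256-sample render buffers lands
# 160 samples into buffer 390 -- deterministically, on every render:
print(locate_event(100000, 256))  # (390, 160)
```

Since this mapping is pure arithmetic, any jitter observed in frozen or bounced audio points to a bug in how the host or the plugin handles within-buffer event offsets, not to clock drift.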
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 22:11:56
(permalink)
|
pianodano
Max Output Level: -67 dBFS
- Total Posts : 1160
- Joined: 2004/01/11 18:54:38
- Location: Va Beach Virginia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 22:18:47
(permalink)
|
T.S.
Max Output Level: -77 dBFS
- Total Posts : 654
- Joined: 2005/08/11 17:29:16
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 23:25:04
(permalink)
Hi dewdman42, My hat's off to you for pursuing this like you are, although I'm not sure where you're going to end up or how significant your findings will be. I've beaten my head against the wall for years with this and all I got was a lump on the head. Hehe, maybe that's why I'm so brain dead. If you're going to do this, I'd suggest you use a sine wave or a square wave around 1 kHz. Edit it by cutting it off both front and back at a zero crossing and put it in your softsampler. It'll be a lot more accurate than a sidestick or hi-hat. Since this thread is so long, I haven't read much of it, so if this has already been covered, I'm sorry, please forgive me. T.S.
|
brundlefly
Max Output Level: 0 dBFS
- Total Posts : 14250
- Joined: 2007/09/14 14:57:59
- Location: Manitou Spgs, Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 23:32:38
(permalink)
If you're going to do this, I'd suggest you use a sine wave or a square wave around 1 kHz.

I've got the sample: a single cycle of a 1000 Hz square wave. I used it to do latency testing on audio. But I don't think I have a software sampler in my arsenal, unless something bundled with Sonar 6 SE can do it. Dave
|
bitman
Max Output Level: -34 dBFS
- Total Posts : 4105
- Joined: 2003/11/06 14:11:54
- Location: Keystone Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 23:34:17
(permalink)
Dudes and Dudettes, My first instrument was drums. Then at 9 years old I switched to guitar. When I was much older I got one of those Roland MIDI drum sets, and after recording MIDI drums I thought: good Lord, I suck at this; why did I not hear it like that when I was recording?! So quantize I did, much, which sterilized a very good part I played. Then I scored a cheap 5-piece acoustic kit for a "house kit" when I started doing others' demos. Guess what: I'm a good drummer on that kit. MIDI jitters all right, a lot.
post edited by bitman - 2007/10/11 23:44:49
|
brundlefly
Max Output Level: 0 dBFS
- Total Posts : 14250
- Joined: 2007/09/14 14:57:59
- Location: Manitou Spgs, Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 23:40:39
(permalink)
Here's another interesting article: http://www.soundonsound.com/sos/Sep02/articles/pcmusician0902.asp Beware of anything or anyone purporting to know the "truth" about anything. From the second paragraph:

"The beauty of VST and DX Instruments is that their playback timing from a MIDI + Audio sequencer is always guaranteed to sample accuracy, since the waveforms are generated slightly ahead of time and outputted just like any other audio track."

Would that it were true. Even when rendering tracks offline, the timing of at least some soft synths is not anywhere near sample accurate. It seems like it should be an easy thing to do, but my testing says otherwise. I generated a track of 15-tick-long MIDI note events at 64th-note intervals (60 ticks at 960 PPQ). I bounced to audio through the TTS-1, using the sharpest drum hit I could find with reverb off so it would be easy to pick out the beginning of transients, and they were all over the place. I haven't crunched the numbers yet, but I know some of them were 100 samples off (about 4 ticks at 120 BPM). I'm still playing around with this, using different synths and rendering methods, but it doesn't look too pretty right now. Offline timing seems no better than real-time. Just trying to provide some more data points.
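For reference, the arithmetic behind "100 samples off is about 4 ticks" (a sketch assuming 120 BPM, 960 PPQ, and 44.1 kHz, as in the test described above):

```python
def samples_per_tick(bpm, ppq, sample_rate=44100):
    """Number of audio samples spanned by one MIDI tick."""
    return sample_rate * 60.0 / (bpm * ppq)

def offset_in_ticks(sample_offset, bpm, ppq, sample_rate=44100):
    """Convert a measured sample offset into MIDI ticks."""
    return sample_offset / samples_per_tick(bpm, ppq, sample_rate)

print(samples_per_tick(120, 960))      # 22.96875 samples per tick
print(offset_in_ticks(100, 120, 960))  # ~4.35 ticks
```

So a 100-sample error is a bit over 4 ticks, consistent with the figure quoted in the post.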
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 23:51:40
(permalink)
Well, I just did one simple test: I took a drum MIDI clip from EZdrummer. Listened to it in Sonar, turned it into a groove clip loop, stretched it out a while, listened to it for a while; it seemed pretty solid to me. I froze it. Still sounded solid. I zoomed way in until 1 ms of time was a few millimeters of horizontal screen space. I could view both the MIDI drum map view and the wave data at the same time. I put the PRV into the track view so I could see the detailed drum map there. Zoomed everything big. I can click in the PRV exactly on the beat, lining up with the MIDI event; the line shows up in the track view and lines up exactly with the WAV file that was created when freezing the MIDI track through EZdrummer as a sound module. (Shrug.) So far, I'm happy with that timing, no complaints.

I have to go to the gym now, but later I will try routing the output from Sonar into a wave file while playing, to see if it's off when not freezing or mixing...though it's somewhat of a moot point. I don't understand why others are getting all-over-the-place results. We will need to dig deeper to understand what it is about their tests that is causing this. So for each of the people having problems, one of us who are not having problems should try to recreate exactly the same test. We should also discuss various system aspects to see if those are the root cause, but right now I'm inclined to say that at least here, so far, I am not detecting any problem while mixing down. I had the metronome assigned to MIDI, by the way, instead of using one of the built-in audio sounds. Please give as many details as you can about the setup you used to create this problem and I will see if I can recreate the bad timing here.
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/11 23:53:32
(permalink)
Would that it were true. Even when rendering tracks offline, the timing of at least some soft synths is not anywhere near sample accurate. Seems like it should be an easy thing to do, but my testing says otherwise.
We should try to identify some specific instruments suffering from this and see if everyone can replicate the problem. You say the TTS-1 consistently replicates it? I will try that when I get back from the gym tonight.
|
Jim Wright
Max Output Level: -66 dBFS
- Total Posts : 1218
- Joined: 2004/01/15 15:30:34
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 00:08:37
(permalink)
Chris, In current PC (and Mac) systems, the audio and MIDI devices do work asynchronously. However, the reasons why have more to do with the use of different timebases for audio and MIDI (and the completely asynchronous nature of MIDI message input/output) than with the use of buffers per se.

It's useful to consider how Firewire (1394) handles audio and MIDI. Both kinds of data are sent "over the wire" using CIPs (Common Isochronous Packets). Because audio and MIDI data are sent, literally, within the same data packets, MIDI is forced to be synchronized exactly with the audio sample data. Let me say that differently. At the moment when MIDI messages are packetized for transmission over Firewire, the MIDI messages are no longer asynchronous; they are now isochronous. Which is to say, the "implicit timestamps" of the MIDI messages are now fixed rigidly against the running audio sample count associated with the Firewire data packet that contains each particular MIDI message fragment. (I edited the MIDI-over-Firewire spec for the MMA; trust me on this.)

Now, Firewire data transport involves buffering. Both audio and MIDI data have to be queued up before they can be packetized and formatted into Firewire frames, and then finally sent over the wire. Note that the MIDI data is asynchronous on its way from the sequencer to the Firewire MIDI driver; it only becomes isochronous at the point, during internal driver processing, where the asynchronous outgoing MIDI messages are converted to isochronous outgoing Firewire data bytes within particular packets. Now, if the sequencer could write the MIDI data directly into the outgoing Firewire data packets, the sequencer could control, with very fine accuracy, exactly when particular MIDI messages would occur w.r.t. the audio data that's being sent over the same Firewire pipe. This is not how Firewire MIDI works currently, but it could be.

Chris noted that a new driver structure would be needed for a hypothetical next-generation MIDI interface. New drivers, certainly. I don't think the driver APIs would necessarily have to change, because DirectMusic (on Windows) and CoreAudio (on Mac) both support high-res timestamps. We would just need a MIDI driver that integrates with the audio drivers to 'do the right things', in lockstep. OK, enough theory and speculation for now.

As dewdman42 pointed out, we don't need sample-accurate sync for MIDI. I think 1/3 millisecond would be fine for pretty much any purpose. Heck, a 1-millisecond time resolution that was both accurate and reliable should be enough. As dewdman42 also pointed out, the "MIDI jitter" that bothers most of us is probably much larger than 1-2 milliseconds, and is probably happening because of various kinds of system configuration issues (or MIDI interfaces with lousy drivers, or various other reasons...). I think most of us would be content if current PC MIDI interfaces worked as well, in practice, as the ones on the 15-year-old hardware sequencers mentioned in many posts on this thread. Those products hardly use sample-accurate MIDI, but lots of people swear by them (and few swear at them, unlike software sequencers...).

As pianodano wrote:
>> I wonder if I am the only one getting the feeling that the existing underlying engineering/architecture is a piecemeal affair ?
Nope. Frankly, it's totally piecemeal. MIDI was developed starting in 1979 (Dave Smith's 'Universal Synthesizer Interface' AES paper; I was in the audience) and first shipped in 1983. It was designed to be as cheap as possible. It was not designed with any thought of synchronizing with audio, or video, or doing anything but hooking two electronic music products together. Everything after that just grew. Different companies proposed different ways of syncing up MIDI with audio, video, whatever. Some of 'em worked, some didn't.

Some of the better ideas died, because the people with those ideas were lousy businessmen. Some of the sketchier ideas made it in the marketplace, because their inventors were much better businessmen than engineers. Kind of like the rest of the computer hardware/software revolution that happened over the last 30 years... I've been involved with MIDI and AES standards work since 1981, off and on. There is no grand design. There's just a lot of people trying to work out the best compromises they can, considering both what we (music geeks) know how to build or code, and what the marketplace seems to want. Kind of shocking, isn't it? - Jim
|
bvideo
Max Output Level: -58 dBFS
- Total Posts : 1707
- Joined: 2006/09/02 22:20:02
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 00:36:27
(permalink)
I have spent a little time today studying the Roland MC-500mkII and my Sonar system on a Pentium 930D (dual core) with a MOTU MIDI Express 128 (USB) and an Alesis Multimix FW 16. I have no high-precision lab gear, so I've done some recording of each system into the other under load in Sonar. The load is a bunch of audio tracks, some Perfect Spaces and some Vintage Channels, all using about 50% of one CPU. With the buffer at 256 and multiprocessing deselected, no clicks were ever heard. My first observation was that it took longer to boot the MC-500 than Windows XP.

I let Sonar use 960 PPQ, while the MC-500 uses 96. So ticks are about half a millisecond in Sonar and about 5 milliseconds in the MC-500 (0.00052 and 0.0052 seconds). Although I did do some experiments with Sonar and the MC-500 taking turns being master and slave, I think that is not very informative. So I adjusted Sonar's tempo to 119.93 against the MC-500's 120 so that the two were not noticeably out of sync after eight measures. The data stream I used was eight measures of quantized quarter-note triads, triggering a JV-80 voice that was pitched and had a sharp attack.

There are two parts to this:
1 - View the jitter at the start of quarter-note triads by examining an audio recording.
2 - View the jitter among all the notes seen in a MIDI recording.

Audio
-----
With Sonar playing the MIDI, I recorded the synth. Using AudioSnap to find the transients, they were all in the range of 4 to 16 ticks (960 PPQ) late. So I say the average latency was 10 ticks and the jitter was -6 to +6. I don't know what part of the system, including the JV-80, contributes to the latency or the jitter, but the jitter could be said to be about -3 to +3 milliseconds.

With the MC-500 playing MIDI, I recorded the synth in Sonar and used AudioSnap again. AudioSnap found transients with jitter of -5.5 to +5.5 ticks, i.e. just a little less than 3 ms plus or minus. The latency is irrelevant here because it involves me moving from one control panel to another to separately start the record and playback. In any case, this jitter for the MC-500 triggering the JV-80 is equivalent to what I saw with Sonar.

MIDI
----
I recorded MIDI to the MC-500 played from Sonar. The notes of the triad were recorded either 0 or 1 tick (of 96 PPQ) apart, and the triads were separated by 95 or 96 ticks accordingly. This is a jitter of -2.5 to +2.5 milliseconds. This means the recording was "damaged" by the occasional insertion of a 96 PPQ clock tick between two notes in a triad. I believe this is unavoidable under any circumstances that have no electronic synchronization.

I recorded MIDI to Sonar played from the MC-500. The maximum jitter in that recording was -7 to +7 960 PPQ ticks, or -3.5 to +3.5 ms. The notes of the triads were never more than 8 ticks from one another (i.e. -4 to +4 ticks, or +/-2 milliseconds). In this case, the triads are somewhat less damaged than in the MC-500 recording because they are not separated as much from each other.

-------
To summarize, I think I saw equivalent jitter in every scenario, roughly +/- 3 milliseconds, and less within the triads. I'm sure this type of setup lacks the precision to measure smaller jitter, much less produce it. Same with me.
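The tick-to-millisecond arithmetic above is easy to get wrong across different PPQ resolutions. A minimal sketch of the conversions used in this comparison (my own illustration, assuming 120 BPM, which is close to what both sequencers were running):

```python
def ms_per_tick(bpm: float, ppq: int) -> float:
    """Duration of one sequencer tick in milliseconds."""
    return (60_000.0 / bpm) / ppq  # one quarter note in ms, divided by ticks per quarter

# Sonar at 960 PPQ vs. the MC-500 at 96 PPQ, both near 120 BPM:
sonar_tick = ms_per_tick(120, 960)   # ~0.52 ms per tick
mc500_tick = ms_per_tick(120, 96)    # ~5.2 ms per tick

# A jitter range given in ticks converts to milliseconds the same way,
# e.g. the -6..+6 tick spread seen in the Sonar audio test:
jitter_ms = 6 * sonar_tick           # ~3.1 ms either side of the mean
```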
|
brundlefly
Max Output Level: 0 dBFS
- Total Posts : 14250
- Joined: 2007/09/14 14:57:59
- Location: Manitou Spgs, Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 01:47:02
(permalink)
Please give as many details as you can about the setup you used to create this problem and I will see if I can recreate the bad timing here.

1. Open a new project file.
2. Set to 960 PPQ and 120 BPM.
3. Insert a MIDI track and the TTS-1, and set the output of the MIDI track to the TTS-1.
4. Open the MIDI track's PRV.
5. Create a series of 64th-note events with 15-tick duration on Eb6 (this is the Claves sound in GM Drums).
6. I copied them to 128 measures, but you can already see the problem in measure 1, so 8 or 16 should be plenty.
6.5. Oops, forgot: pan hard left, and set output level to max for best results.
7. Select the MIDI track and the TTS-1, and bounce MONO to Track 3.
8. Disabling Fast Bounce made no difference.
8.5. Go get a beer from the fridge.
9. Turn on PRV in the Track View.
10. Zoom until you can see two events and their corresponding wave pulses.
11. Scroll around, and you should immediately see that some transients start closer to the beginning of the MIDI event than others.
12. Zoom further until you can clearly see individual samples.
13. Set the Now indicator to show samples (click it to scroll through the formats).
14. Scroll to the first zero-crossing *after* the beginning of a wave pulse, and write down the sample value (MS Excel can be helpful here).
15. Scroll to the next drum hit, find the same first zero-crossing, and record the sample value. Repeat several more times, recording the values and subtracting them to find the interval between hits.
16. If the project is set up as described (and you run a 44.1k sample rate like I do), the nominal distance between two drum hits should be 1378 samples. If you get the same result I got, you will find many intervals of 1407 to 1414, with every 4th or 5th one being 1274. If you run a different sample rate, just scale the numbers accordingly.

1274 is 104 samples (about 2.4 ms) short of what it should be, but because all the other intervals are a little too long, things get close to being back on track every 4th or 5th pulse. To me, this looks like a typical case of some sort of cyclic mathematical MODding. It could be a programming error in the rendering algorithm of the TTS-1, or somewhere else in Sonar; I don't know. I haven't yet tried this with another soft synth.

Finally, I should mention that you may be tempted to use AudioSnap to find the transients. I did, and found that AudioSnap has problems of its own. I'll leave you to see these problems for yourself. Suffice it to say that it is only partially helpful in this effort.

Phew... I need another beer. Hope I got everything right. Dave
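The expected 1378-sample spacing in step 16 follows directly from the tempo, note value, and sample rate, and the interval-differencing in steps 14-15 is simple subtraction. A minimal sketch of that arithmetic (the logged positions here are hypothetical values of my own, chosen to illustrate the long/short pattern described above):

```python
SAMPLE_RATE = 44100
BPM = 120

# A 64th note is 1/16 of a quarter note; a quarter note at 120 BPM is 0.5 s.
quarter_sec = 60.0 / BPM
expected = (quarter_sec / 16) * SAMPLE_RATE   # 1378.125 samples between hits

# First-zero-crossing positions logged from the rendered wave (hypothetical
# values: several intervals that run slightly long, then a short one of 1274).
positions = [0, 1410, 2820, 4230, 5504, 6914]
intervals = [b - a for a, b in zip(positions, positions[1:])]
errors = [i - expected for i in intervals]    # deviation from nominal, in samples
```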
post edited by brundlefly - 2007/10/12 02:15:46
|
T.S.
Max Output Level: -77 dBFS
- Total Posts : 654
- Joined: 2005/08/11 17:29:16
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 03:08:55
(permalink)
ORIGINAL: brundlefly If you're going to do this I'd suggest you use a sine wave or a square wave around 1 kHz. I've got the sample: a single cycle of a 1000 Hz square wave. I used it to do latency testing on audio. But I don't think I have a software sampler in my arsenal, unless something bundled with Sonar 6 SE can do it. Dave

Hi Dave, Actually I'd make it more than one cycle. Because of slew rate and other factors, I doubt it will reach maximum output. You also need enough cycles to readily see it in the wave editor. If you're going to use 64th notes, then make it a little less than that long. Keep in mind that at 120 BPM a quarter note is 500 ms, so working at 120 BPM is a good tempo for configuring your timing. The thing about using sidestick, hi-hat, or other percussive sounds is that unless you can go in and see exactly how they've edited the waveforms, you really don't have a clue where you're at. However, if you don't have a soft sampler then you may not have a way of using your own waveforms. At any rate, I'm very interested to see what results you come up with. T.S.
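A test burst like the one discussed above is easy to generate programmatically. Here is a minimal sketch (my own illustration, not anything from the thread) that writes a few cycles of a 1 kHz square wave as a 16-bit mono WAV at 44.1 kHz, using only the Python standard library:

```python
import struct
import wave

SAMPLE_RATE = 44100
FREQ = 1000          # Hz
CYCLES = 4           # more than one cycle, per the advice above
AMPLITUDE = 23170    # roughly -3 dBFS for 16-bit audio

samples_per_cycle = SAMPLE_RATE // FREQ   # 44 samples (44.1 truncated)
frames = []
for n in range(CYCLES * samples_per_cycle):
    # High for the first half of each cycle, low for the second half.
    phase = n % samples_per_cycle
    value = AMPLITUDE if phase < samples_per_cycle // 2 else -AMPLITUDE
    frames.append(struct.pack('<h', value))  # little-endian signed 16-bit

with wave.open('square_1k.wav', 'wb') as w:
    w.setnchannels(1)            # mono
    w.setsampwidth(2)            # 16-bit
    w.setframerate(SAMPLE_RATE)
    w.writeframes(b''.join(frames))
```

Loading the resulting file into a soft sampler gives a test signal whose exact shape is known, which avoids the guesswork about edited percussion waveforms mentioned above.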
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 13:39:36
(permalink)
I will try the TTS test later tonight
|
brundlefly
Max Output Level: 0 dBFS
- Total Posts : 14250
- Joined: 2007/09/14 14:57:59
- Location: Manitou Spgs, Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 13:42:02
(permalink)
Actually I'd make it more than a cycle. Because of slew rate and other factors I doubt it will reach maximum output. You also need enough cycles to readily see it in the wave editor. If you're going to use 64th notes then make it a little less than that long.

One cycle of 1 kHz works great for me on all counts. I can hear and see them clearly, and they go to exactly the level I set (-3 dB in this case - gotta keep the output level turned down!). The clip view of the original wave is a perfect square. When you re-record it through an analog input, you get a little bit of "anticipatory" rise in the signal level for about a dozen samples before the leading edge of the square, but then it pops right up to peak level in no more than two samples, with the vertical parts of the recorded pulse lining up perfectly with the source. It's a thing of beauty. I can send you a screenshot if you'd like (or maybe post one somewhere, so I can link to it in a post on the forum).

The thing about using sidestick, hi-hat, or other percussive sounds is that unless you can go in and see exactly how they've edited the waveforms, you really don't have a clue where you're at.

I'm not sure what you mean by this. Again, it worked perfectly for my purposes, except for having to painstakingly pick out the first zero-crossing in each pulse. The square wave would have made this a lot easier.

At any rate, I'm very interested to see what results you come up with.

I think I've pretty much described most of the key results. There are a lot of interesting little details I observed along the way (like AudioSnap completely missing transients identical to the dozens of others that it found, or putting the transient marker in the middle of the pulse where there is a tiny modulation of the base frequency) that were a little off-topic, so I've left them out for now. Let me know if there's something specific you want to hear about.

The only other test I'm thinking about doing right now is a re-run of the rendered-wave timing test with a different soft synth, but I only have the ones that came bundled with Sonar, 'cause I've always used hardware synths. I'm curious to see what results people get from other DXis/VSTs.

Incidentally, I've done a lot of data analysis and hardware/software product testing and development support in my career. That's where both my interest and creative testing methods come from. My last job was with a company using the rectified-but-unfiltered signal output of cathodic protection rectifiers to analyze coating condition on underground oil and gas pipelines, so this work is very similar to what I did for them.
post edited by brundlefly - 2007/10/12 13:53:25
|
brundlefly
Max Output Level: 0 dBFS
- Total Posts : 14250
- Joined: 2007/09/14 14:57:59
- Location: Manitou Spgs, Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 15:02:05
(permalink)
Keep in mind that at 120 BPM, a quarter note is 500 ms, so working at 120 BPM is a good tempo for configuring your timing.

I thought the same thing initially, but I've just decided the magic number for testing at 44.1 kHz is 56.25 BPM. This is the highest tempo you can have that will give a whole number of samples per tick at 960 PPQ. The resulting values of importance are:

900 ticks/sec (1.111... ms/tick)
49 samples/tick
47040 samples/beat (at 4/4)
30 ticks/frame

I figure this should take all rounding errors and related artifacts out of the system if the timing master is samples. My initial test of TTS-1 rendering at this tempo still found timing errors of 100 samples or more, though at this tempo, that's only about 2 ticks.

Correction: Looks like 78.75 BPM is the highest tempo that will give a whole number of samples/tick, 35, which is a nicer number in some ways, but ms/tick gets uglier:

1260 ticks/sec = 0.79365... ms/tick
35 samples/tick
33600 samples/beat
42 ticks/frame

Correction 2: D'oh! Brain not working so well today. 131.25 BPM (21 samples/tick) and 183.75 BPM (15 samples/tick) will also work, and stay under the 200 BPM limit (?) in Sonar. But I found that the lower tempo of 56.25 still works best, because AudioSnap finds all the transients in roughly the right place. Apparently the higher tempos have the tails of the pulses running into the next beat, which confuses AudioSnap, even though the transients still look clear. Details, details, and more details...
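The tempo arithmetic in this post can be checked mechanically: samples per tick is just the sample rate divided by ticks per second. A minimal sketch verifying the tempos named above (my own illustration):

```python
SAMPLE_RATE = 44100
PPQ = 960

def samples_per_tick(bpm: float) -> float:
    """Audio samples elapsed per 960-PPQ MIDI tick at a given tempo."""
    ticks_per_sec = bpm * PPQ / 60.0
    return SAMPLE_RATE / ticks_per_sec

# Each of the tempos named above yields a whole number of samples per tick:
for bpm in (56.25, 78.75, 131.25, 183.75):
    spt = samples_per_tick(bpm)
    assert spt == int(spt), f"{bpm} BPM gives fractional samples/tick"

# Other derived figures for the 56.25 BPM case:
ticks_per_sec = 56.25 * PPQ / 60                   # 900 ticks/sec
samples_per_beat = samples_per_tick(56.25) * PPQ   # 47040 samples/beat
```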
post edited by brundlefly - 2007/10/12 15:48:02
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 15:41:06
(permalink)
It's the milliseconds you need to care about, not ticks. The number of ticks is irrelevant when rendering through a VSTi. And I think the timing should be perfect in this case.
|
pianodano
Max Output Level: -67 dBFS
- Total Posts : 1160
- Joined: 2004/01/11 18:54:38
- Location: Va Beach Virginia
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 15:46:18
(permalink)
ORIGINAL: Jim Wright As pianodano wrote: >> I wonder if I am the only one getting the feeling that the existing underlying engineering/architecture is a piecemeal affair?

Nope. Frankly, it's totally piecemeal. MIDI was developed starting in 1979 (Dave Smith's 'Universal Synthesizer Interface' AES paper; I was in the audience), and first shipped in 1983. It was designed to be as cheap as possible. It was not designed with any thought of synchronizing with audio, or video, or doing anything but hooking two electronic music products together.

Everything after that -- just grew. Different companies proposed different ways of syncing up MIDI with audio, video, whatever. Some of 'em worked, some didn't. Some of the better ideas died, because the people with those ideas were lousy businessmen. Some of the sketchier ideas made it in the marketplace, because their inventors were much better businessmen than engineers. Kind of like the rest of the computer hardware/software revolution that happened over the last 30 years...

I've been involved with MIDI and AES standards work since 1981, off and on. There is no grand design. There's just a lot of people trying to work out the best compromises they can, considering both what we (music geeks) know how to build, or code, and what the marketplace seems to want. Kind of shocking, isn't it? - Jim

Great! We have DAWs designed by committee.
Best, Danny Core I7, win XP pro, 3 gig ram, 3 drives- Lynx Aurora firewire- Roll around 27 inch monitor, 42 inch console monitor- Motif xs controller - Networked P4's and FX Teleport for samples- Muse Receptor VIA Uniwire for samples and plugs- UAD QUAD Neve - UAD 1- Sonar X1 but favor 8.5 GUI - Toft ATB 32 - Vintage hardware - Tascam MS-16 synched via Timeline Microlynx -Toft ATB32 console
|
brundlefly
Max Output Level: 0 dBFS
- Total Posts : 14250
- Joined: 2007/09/14 14:57:59
- Location: Manitou Spgs, Colorado
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 16:03:40
(permalink)
It's the milliseconds you need to care about, not ticks. The number of ticks is irrelevant when rendering through a VSTi. And I think that the timing should be perfect in this case. I think it's really the sample interval that's the key, since there are 44.1 samples per millisecond. Since the MIDI events being rendered to audio are "quantized" to 1-tick intervals, you'll get small rounding errors if there isn't a whole number of samples per tick. It turns out not to make a real difference in testing, since the rendering errors are on the order of 100 samples, but I wanted things to be as clean as possible.
|
Noel Borthwick [Cakewalk]
Cakewalk Staff
- Total Posts : 6475
- Joined: 2003/11/03 17:22:50
- Location: Boston, MA, USA
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 16:27:22
(permalink)
You are confusing when jitter can take place. You can't get MIDI jitter purely within SONAR while bouncing or freezing (irrespective of whether it's a fast or slow bounce). MIDI events stored in a SONAR project are timestamped. When you bounce to a soft synth, the MIDI events are fed to the synth with their timestamps implicitly locked to the audio buffers. There is no chance of jitter occurring in this scenario, since the synth knows exactly when to render the audio for the MIDI based on the current sample position and the timestamp on the MIDI event. Jitter doesn't apply at all here.

ORIGINAL: dewdman42 Anyway, back to Sonar... This weekend I am going to try to run some tests where I freeze tracks to audio and compare the MIDI notes to the waves I see on the audio track, to see if there is any jitter happening during freeze or mixdown. After that I will try to capture the audio that is playing when I am just hitting play and playing MIDI tracks through soft synths. Then I compare that audio track to the MIDI track, look at the waves, and try to determine if there is jitter happening. Those two situations are what I am most concerned about right now. If Sonar is screwing that up, then Cakewalk needs to get busy...
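The sample-locked scheduling described here is similar to how VST-style hosts deliver MIDI to a synth during offline rendering: each event carries an exact offset into the current audio buffer, so the rendered result is deterministic on every bounce. A minimal sketch of the idea (the names and structures are my own illustration, not SONAR internals):

```python
from dataclasses import dataclass

@dataclass
class MidiEvent:
    sample_time: int   # absolute timestamp, in samples from project start
    note: int

BUFFER_SIZE = 256

def render(events, total_samples):
    """Hand each event to the synth at an exact offset within its buffer."""
    rendered = []   # (buffer_start, offset_within_buffer, note)
    for start in range(0, total_samples, BUFFER_SIZE):
        for ev in events:
            if start <= ev.sample_time < start + BUFFER_SIZE:
                # The synth begins the note at this exact sample offset, so
                # the output is identical on every bounce: no jitter possible.
                rendered.append((start, ev.sample_time - start, ev.note))
    return rendered

events = [MidiEvent(0, 60), MidiEvent(300, 64), MidiEvent(700, 67)]
out = render(events, 1024)
```

Contrast this with real-time playback through a hardware MIDI port, where events are serialized onto a 31.25 kbaud wire with no timestamps at all, which is where the jitter discussed in this thread can creep in.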
|
dewdman42
Max Output Level: -74 dBFS
- Total Posts : 839
- Joined: 2004/09/20 16:37:27
- Status: offline
RE: MIDI "Jitter" - It Does Exist
2007/10/12 16:29:52
(permalink)
Oh, I see what you're saying. Well, in any case, 100 samples off is not OK. The MIDI ticks can get confusing, since so much depends on tempo and resolution, but even if there is a rounding error of a sample or so, it's not worth worrying about. Pick any old tempo and any PPQ: the rendered audio should be within a sample or two of exactly where you would expect it to be. Personally, I would be satisfied with the audio being anywhere within 1 ms of where I expected it.
post edited by dewdman42 - 2007/10/12 16:43:11
|