ORIGINAL: dewdman42
ORIGINAL: RTGraham
but it probably also means that the "slop," which is tied to the millisecond timer (as opposed to the MIDI sequencer clock), spans twice the number of ticks when it does rear its head.
exactly
Right... however, I can see how doubling the tempo might still be useful to Steve... if he happens to be using an extremely stable MIDI interface, where he's already minimizing his jitter, then he'll be capturing the subtleties of his performance with finer resolution overall, jitter notwithstanding. I wonder what interface he's using.
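To put rough numbers on that (assuming, purely for illustration, a 480-PPQN sequencer resolution and 1.5 ms of worst-case interface jitter), a quick calculation shows both effects at once: doubling the tempo halves the duration of a tick, so the same millisecond-level slop covers twice as many ticks, while each captured note also lands on a twice-as-fine grid.

#include <stdio.h>

int main(void)
{
    const double ppqn = 480.0;      /* assumed sequencer resolution in ticks per quarter note */
    const double jitter_ms = 1.5;   /* assumed worst-case interface jitter in milliseconds */
    const double tempos[] = { 120.0, 240.0 };

    for (int i = 0; i < 2; i++) {
        double ms_per_tick = 60000.0 / (tempos[i] * ppqn);   /* duration of one tick */
        printf("%3.0f BPM: %.4f ms per tick, so %.1f ms of jitter spans %.1f ticks\n",
               tempos[i], ms_per_tick, jitter_ms, jitter_ms / ms_per_tick);
    }
    return 0;
}

At 120 BPM that works out to about 1.04 ms per tick (the 1.5 ms of slop covers roughly 1.4 ticks); at 240 BPM it's about 0.52 ms per tick (the same slop covers roughly 2.9 ticks).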
Which also makes me wonder... I know a lot of people who have old Emagic Unitor-8 interfaces that they never gave up, even though they're out of production, because they feel they're extremely stable. I'm talking about musicians who are serious about their timing... I wonder what kind of jitter characteristics those interfaces have. Granted, some of them are being used in Mac environments, so the entire communication mechanism might be different.
EDIT: I just did some searching on the Unitor-8 interfaces... Emagic had incorporated, at the time, a new "technology" called AMT (Active MIDI Transmission). It sounds, essentially, like hardware timestamping that gets decoded only by the same manufacturer's software, in this case Logic. Since Apple now owns Logic, one could hypothesize that they folded some of those concepts into CoreMIDI, which would make the Unitor a particularly stable interface on the Mac now, regardless of which application (e.g. Pro Tools) accesses it.
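For what it's worth, per-event timestamping is exactly what CoreMIDI already exposes to applications: every outgoing packet carries a host-time timestamp, and the system (and, where the driver supports it, the interface itself) handles delivering the data at that time instead of leaving it to the whims of the application thread. The sketch below only uses the standard CoreMIDI calls, assumes the output port and destination endpoint were set up elsewhere, and makes no claim about how AMT itself is implemented.

#include <CoreMIDI/CoreMIDI.h>
#include <mach/mach_time.h>

/* Schedule a note-on for 10 ms in the future; CoreMIDI, not the
   application thread, is responsible for hitting that time. */
static void send_timestamped_note(MIDIPortRef outPort, MIDIEndpointRef dest)
{
    /* Convert 10 ms into host-time (mach_absolute_time) units. */
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);
    uint64_t ten_ms_host = (10000000ULL * tb.denom) / tb.numer;

    Byte noteOn[3] = { 0x90, 60, 100 };   /* note-on, middle C, velocity 100 */

    Byte buffer[128];
    MIDIPacketList *pktList = (MIDIPacketList *)buffer;
    MIDIPacket *pkt = MIDIPacketListInit(pktList);
    pkt = MIDIPacketListAdd(pktList, sizeof(buffer), pkt,
                            mach_absolute_time() + ten_ms_host,   /* the timestamp */
                            sizeof(noteOn), noteOn);
    if (pkt != NULL)
        MIDISend(outPort, dest, pktList);
}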
An interesting scenario, if it's true. And an argument for Microsoft to do something similar with Windows: based on all of the posts in this thread, it's becoming clear (at least to me) that one solution, and probably a good one, would be for Windows to incorporate a kernel-level MIDI API with better clock resolution and support for timestamping.
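For comparison, the standard Win32 (winmm) call an application uses to send a MIDI event has nowhere to put a timestamp at all; the event goes out whenever the calling thread happens to run, which is exactly the gap a kernel-level, timestamp-aware API would close. A minimal illustration of the current situation:

#include <windows.h>
#include <mmsystem.h>   /* link with winmm.lib */

/* Sends note-on, middle C, velocity 100 "right now".  The DWORD packs
   status/data1/data2 into its low three bytes; there is no parameter
   for when the event should actually go out, so timing accuracy rests
   entirely on user-mode thread scheduling. */
void send_note_now(HMIDIOUT hOut)
{
    midiOutShortMsg(hOut, 0x00643C90);
}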