Great answers, all. I've been using MIDI almost since its inception circa 1982 (close to 25 years now). I've been sequencing with all manner of hardware sequencers since 1986, and more recently doing software sequencing w/ 4 versions of Sonar. I definitely notice various sequencers being more or less accurate at recording and playing back un-quantized MIDI performances.
A few comments:
1) I am using 960PPQ. However, I have recorded into MPCs w/ 1/10th that resolution (96PPQ) and gotten results which to my ears sound more accurate than what I hear with Sonar (maybe more accurately stated: with software-based MIDI recording solutions, since I've not tried other software products such as Logic, DP, Cubase, etc.). So from my perspective, clock resolution is not as important as marketers would have you believe. For example, set a metronome at 120BPM. Listen to each beat and try to conceptualize it being divided into 96 parts. Now 960 parts. What's the difference? At 120BPM each beat lasts 500ms, so even 96PPQ resolves to roughly 5ms per tick, which is already down around the limits of audibility (see the quick arithmetic below comment 3). If MIDI were being recorded with perfect accuracy, 96PPQ would be completely sufficient.
2) As I understand it, MIDI is a slow protocol by 2007 standards, and a serial one at that; it has run at 31,250 baud since the spec was published. Since it seems that I was getting equal or better results 20 years ago over plain 5-pin DIN cables, I must assume that Firewire vs. USB would not make a difference in terms of causing a MIDI bottleneck; either transport is vastly faster than the MIDI stream itself (see the wire-speed math below comment 3). Certainly it's a concern w/ audio, but I don't think so with MIDI. People have been sequencing with software-based solutions for at least 20 years, when computers were much, much less powerful than even today's budget models.
3) LOL, Dave, my PC's a little more modern than that.
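
To put numbers on the resolution point in comment 1, here's the tick-length arithmetic as a trivial Python sketch (the BPM and PPQ values are just the ones from my example):

```python
# Tick duration at each PPQ setting for a 120BPM click.
BPM = 120
beat_ms = 60_000 / BPM  # one quarter note = 500 ms at 120 BPM

for ppq in (96, 960):
    print(f"{ppq} PPQ -> {beat_ms / ppq:.2f} ms per tick")
# 96 PPQ -> 5.21 ms per tick
# 960 PPQ -> 0.52 ms per tick
```

Either figure is far finer than the slop I'm hearing, which is why I say resolution isn't the culprit.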
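And here's the wire-speed math behind comment 2, again as a back-of-the-envelope Python sketch, using the standard MIDI figures of 31,250 baud and 10 bits per byte on the wire:

```python
# Time to move MIDI bytes down a standard DIN cable.
BAUD = 31_250       # fixed by the MIDI 1.0 spec
BITS_PER_BYTE = 10  # 1 start bit + 8 data bits + 1 stop bit

byte_us = BITS_PER_BYTE / BAUD * 1_000_000
print(f"one byte: {byte_us:.0f} us")                    # 320 us
print(f"3-byte Note On: {3 * byte_us / 1_000:.2f} ms")  # 0.96 ms
```

Roughly 1ms per note event. FireWire and USB both move data thousands of times faster than that, so the transport between the interface and the computer isn't where the sloppiness comes from.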
My theory (completely unscientific) is that as these software DAW applications have gotten more and more bloated, they put less and less emphasis on tight MIDI timing in the rush to "gold-plate" the application. That, combined with how a computer's OS deals with incoming and outgoing data (juggling multiple housekeeping chores at once, mouse moves, screen repaints, and so on, but in the final analysis handling them one at a time), practically begs for MIDI timing to be less than 100% accurate. And why should a manufacturer put emphasis on it? Many Sonar users in this forum seem to be guitarists or bassists who are just thrilled to have audio recording available outside of an expensive commercial recording studio. Many users have no idea what MIDI is, how long it's been around, or how it differs from audio recording (get Scott Garrigus' book).
All I know is that my ears hear the difference when MIDI is recorded and played back accurately, versus sloppily, and I am very happy to see that at least one software manufacturer has acknowledged this issue, and has apparently done something about it.
I wish Cakewalk would put aside their rush to audio nirvana for a minute and do some serious MIDI recording and playback tests: have a seasoned professional keyboardist record MIDI data in real time into Sonar, and at the same time record the audio of that performance into Pro Tools or something similar. Then record the output of Sonar's MIDI playback, from the same sound source, into Pro Tools as well. Compare the timing of the 2 tracks at a high level of magnification. I think a difference would be easily perceptible.
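For what it's worth, the comparison step wouldn't even need golden ears. Once both takes are bounced to WAV, something like the sketch below could measure the drift note by note. This is just a rough illustration under assumptions I'm making up (16-bit mono WAV files that start at the same instant, a crude amplitude-threshold onset detector, and hypothetical file names):

```python
import wave
import numpy as np

def detect_onsets(path, threshold=0.2, min_gap_ms=50):
    """Return onset times in seconds: each first loud sample after a quiet gap."""
    with wave.open(path, "rb") as wf:
        rate = wf.getframerate()
        raw = wf.readframes(wf.getnframes())
    # Assumes 16-bit mono audio for simplicity; normalize to -1.0..1.0.
    samples = np.frombuffer(raw, dtype=np.int16).astype(np.float64) / 32768.0
    loud_idx = np.flatnonzero(np.abs(samples) > threshold)
    min_gap = int(rate * min_gap_ms / 1000)
    onsets, last = [], -min_gap
    for i in loud_idx:
        if i - last >= min_gap:      # loud again after a silence: new note
            onsets.append(i / rate)
        last = i
    return onsets

# Hypothetical file names; both recordings must start at the same instant.
live = detect_onsets("live_take.wav")            # audio captured during the performance
midi = detect_onsets("sonar_midi_playback.wav")  # Sonar's MIDI played back afterward

for n, (a, b) in enumerate(zip(live, midi), 1):
    print(f"note {n:3d}: drift {(b - a) * 1000:+7.1f} ms")
```

If the drift were a constant offset, that would just be latency, which is correctable. If it jumps around from note to note, that's the jitter I'm complaining about.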
(Edited for grammar/spelling)