https://www.dropbox.com/s/uj2qqw6a8oofcns/miditest.cwb Here's a simple test project I created in Sonar, consisting of a few MIDI tracks
and one instance of Battery 4 (VST). I rendered the audio of this project within Sonar.
I then exported the MIDI parts, saved the Battery kit, and recreated exactly the same simple project in
Studio One. I rendered the audio of this cloned project within Studio One.
Finally, I loaded both rendered files into Sonar.
Both files start at 0'00'' and are sample-aligned.
Ok, let's try to null those babies... they don't null.
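For anyone who wants to quantify "they don't null" instead of judging by ear, here's a minimal sketch of the same test in Python. It assumes both renders were exported as 16-bit PCM mono WAVs (adjust for your actual export settings); the filenames are placeholders, not the actual files from the project above.

```python
import wave
import array
import math

def read_samples(path):
    """Read a 16-bit PCM WAV file and return its samples as signed ints.
    (Assumes 16-bit PCM -- adjust for other bit depths.)"""
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "expects 16-bit PCM"
        frames = w.readframes(w.getnframes())
    return array.array("h", frames)

def null_depth_db(a, b):
    """Subtract b from a sample-by-sample and return the residual RMS
    level in dBFS. A very deep value (say -90 dB or lower) means the
    renders effectively null; values near the program level mean they
    don't cancel at all."""
    n = min(len(a), len(b))
    residual = sum((a[i] - b[i]) ** 2 for i in range(n)) / n
    if residual == 0:
        return float("-inf")  # perfect, bit-identical null
    full_scale = 32768.0 ** 2
    return 10 * math.log10(residual / full_scale)

# Hypothetical filenames for the two renders:
# print(null_depth_db(read_samples("sonar_render.wav"),
#                     read_samples("s1_render.wav")))
```

If the two DAWs rendered the same audio, the residual sits way down near the noise floor; anything in the -20 to -40 dB range means clearly audible differences survive the inversion.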
I have to pull Sonar's render down by 0.2 or 0.3 dB to get close to some kind of null, but without ever reaching it.
Even more interesting: at -0.1, -0.2, -0.3 and -0.4 dB, different elements of the drums get nulled, but never
the whole drum part. For example, at -0.2 I get an almost-null on the clap, and at -0.3 an almost-null on the kick and snare.
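That fader-trim sweep can be automated too. Here's a small sketch that applies a range of dB offsets to one render and reports the null depth at each, which is exactly what pulling the fader down in 0.1 dB steps and listening does. The function name and the float-sample convention are my own assumptions, not anything from Sonar or Studio One.

```python
import math

def sweep_gain_offsets(a, b, offsets_db):
    """For each trim (in dB) applied to track `a`, return how deep the
    null against track `b` gets, as residual power in dB. `a` and `b`
    are sequences of float samples in the -1.0..1.0 range (hypothetical
    helper -- not any DAW's API)."""
    results = {}
    n = min(len(a), len(b))
    for off in offsets_db:
        gain = 10 ** (off / 20.0)  # dB trim -> linear gain factor
        residual = sum((a[i] * gain - b[i]) ** 2 for i in range(n)) / n
        results[off] = (10 * math.log10(residual)
                        if residual > 0 else float("-inf"))
    return results

# Example: sweep from -0.4 dB to 0 dB in 0.1 dB steps and look for the
# offset that gives the deepest null.
# best = min(results, key=results.get)
```

If the deepest null consistently lands at a nonzero offset, the two renders differ in overall gain staging rather than (or in addition to) timing; if no single offset nulls everything, as described above, the difference is per-element, not a global level mismatch.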
Is it timing? Is it sound? Is it the way each program handles the summing of transients that hit at the same time?
Is it velocity interpretation by the two DAWs?
Musically speaking now, my gut feeling is shared by many here at the studio. In Sonar's render there's almost too much information, as if the different elements weren't talking to one another... as if every element was separated from the others. It's a weird feeling, not very musical, almost ''cold''. In S1, it's almost as if there's something not SONICALLY but MUSICALLY correct about the way
the drums feel.
p.s :
If I do the same experiment but with an audio part or an audio loop instead of a MIDI part/soft-sampler project, I get a complete null.
So this issue is only related to the way Sonar handles MIDI playback and softsynths.
p.s 2:
We've tested with Live and Logic as well and found similar ''musically correct'' attributes.