Notation/Staff view can help your mixes, even if you don't read music
This idea became very apparent to me recently, while wrapping up my latest project.
For reference, you can read about the project, which I published in its entirety for the Cakewalk (Sonar) community here:
http://forum.cakewalk.com/tm.aspx?m=1914898 (but you don't have to, this is a generic technique).
Here's the gist: If you have any sort of MIDI data in your projects (controlling hardware/software synths), it's easy to pop open your project in Staff view occasionally and get a "picture" of the voicings you use for each track. By that I mean, where are the notes for an instrument going: are they high, low, high sometimes then low, high and low at the same time, etc.? Where your notes sit, frequency-wise, is a crucial thing to know when mixing and fooling with filters, EQs, and plugin settings.
Yeah, you can see this note movement or note positioning in Piano Roll view (PRV), but Staff view makes ranges of notes stand out more, at least to me. AND each track gets its own line/staff in Staff view, so the tracks are not sitting on top of each other as they do in PRV. It's a different way of looking at things, so maybe your brain gets jogged in different directions and you notice something new.
Maybe you notice, he added ominously, that some or all of your tracks end up on TWO staves (plural of "staff"). This can be a red flag for you when mixing. Why? (Glad you asked.) Because these darn two-handed keyboard controllers that musicians use (modeled after the pianoforte keyboard, one of the great contributions to world culture) encourage our two hands to go all over the place, sonically speaking. An 88-key keyboard covers a huge chunk (the majority) of the notes that register as musical tones to the human ear. (Darn, I should've stated up top that if you are a dog, cat, bat, or blue whale browsing this post, you can probably skip it, as it doesn't apply to you.) So, I know that when I whale--I mean, wail away on the keyboard, my hands sometimes move way far apart from each other, like the north ends of two magnets. In doing so, they start to split the sounds coming from whatever instrument is being controlled into basically two instruments: one sounding low in double-digit Hz and one sounding high in four-digit Hz (maybe an extreme example, but consider that the A's on a standard keyboard go up the audio scale 27.5, 55, 110, 220, 440, 880, 1760, 3520 Hz...even more if your synth is transposed up or down an octave or two).
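(Side note for the numbers-inclined: that octave doubling falls straight out of the standard equal-temperament MIDI-note-to-frequency formula, f = 440 x 2^((n - 69)/12). Here's a minimal Python sketch to verify it; the function name is mine and has nothing to do with Sonar:)

```python
# The standard equal-temperament formula: A4 (MIDI note 69) = 440 Hz,
# and every 12 semitones up doubles the frequency.
def note_freq(midi_note):
    """Frequency in Hz for a MIDI note number."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# The A's on an 88-key keyboard: A0 (MIDI 21) up through A7 (MIDI 105).
for n in range(21, 106, 12):
    print(f"A{(n - 21) // 12}: {note_freq(n):.1f} Hz")
```

Running it walks the doubling ladder from 27.5 Hz at the bottom A up to 3520 Hz.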
And if you end up playing basically two different instruments (the same timbre, but vastly different frequencies), you really ought to consider:
a) scrutinizing your EQ curve for this instrument, and if you don't have one (any EQ at all), you probably need to add one. (Or trust to good fortune that the two instrument sounds coming from one track will just magically sit in the mix perfectly as played--do you feel lucky?) Your curve might very well need camel humps (two or--yeesh--three).
b) splitting this track into two (or more) tracks, each with its own synth: one track with the high notes, one with the low notes. (Yeah, yeah, I know, double the synths and the CPU usage starts to climb. Everything is a compromise, didn't you know that?)
c) going back and rearranging some tracks. (Stop glaring at me, sometimes that's the best/only solution. I hate doing it too.) Maybe you really should be using two totally different instruments (different timbre synths). That's what a symphony orchestra does: Bassoons get to play high notes and flutes play low ones. (Um, wait, do I have that correct?... but you get the idea, right?)
(Lately, I've taken to splitting my finished piano part into two tracks, one for each hand, with two identical piano synths (although they could be different sounding piano synths, I'll have to try that!). The right hand track gets EQ'd differently than the left hand track does. "The right hand does not know what the left hand is Q'ing," as they say.)
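If you want to preview what option (b) would do before committing to it, a split like that boils down to a pitch cutoff. A hypothetical Python sketch (the event format, the tiny sample data, and the split point are my own stand-ins, not anything Sonar exposes):

```python
# Sketch of option (b): divide one track's note events into a "left hand"
# (low) track and a "right hand" (high) track at a chosen split point.
# Each event here is a hypothetical (MIDI pitch, start time) tuple.
SPLIT_POINT = 60  # middle C (MIDI note 60); adjust to taste

def split_track(events, split=SPLIT_POINT):
    """Return (low_events, high_events), split at the given pitch."""
    low = [e for e in events if e[0] < split]
    high = [e for e in events if e[0] >= split]
    return low, high

# Low chords and high melody mashed into one track:
events = [(48, 0.0), (72, 0.0), (50, 1.0), (76, 1.0)]
left, right = split_track(events)
# left  -> [(48, 0.0), (50, 1.0)]
# right -> [(72, 0.0), (76, 1.0)]
```

Each resulting list would then feed its own synth and get its own EQ, exactly like the two-handed piano split above.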
Back to Staff view. Here are some things to watch for that quickly pop out in that view.
First, does an instrument have spots where the notes go way above the staff, and spots where notes go way below the staff? If so, then the tones of your instrument are all over the map, and an EQ setting that works in one measure might be totally inappropriate in another.
If there are spots where there are notes both way above the staff and way below it at the same time (which can be pretty standard for a piano player), then, whoa, there is possibly a serious EQ issue here. Low notes and high notes at the same time? What frequency range does this track sit in? Do you focus settings on the low notes or the high notes? If you emphasize a broad range, to fit all the notes, high and low, you're kinda, sorta, not emphasizing anything--just making it all louder.
Sometimes Sonar helps guide you all by itself, because the program automatically splits a track into treble and bass staves if it sees notes going up and down, high and low. If your track has two staves (treble/bass) assigned to it, that's the flag I mentioned: maybe your playing style (voicing) with two hands has generated, in effect, two different instruments, sonically speaking. How do you EQ that track? Where should it sit in the mix? Emphasize the low notes? But then maybe the high notes disappear, or worse, they sit on some other track and squash it. Etc., etc. Time to split into two tracks?
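For the scripting-minded, the "two staves" flag amounts to a simple range check on a track's pitches. A rough Python sketch, where the split point, the margin, and the track data are purely my own illustration (not how Sonar actually decides):

```python
# Sketch: flag tracks whose notes sit well below AND well above a split
# point (middle C, MIDI 60), roughly mimicking a treble/bass staff split.
def spans_two_staves(pitches, split=60, margin=7):
    """True if a track has notes well below and well above the split."""
    return min(pitches) < split - margin and max(pitches) > split + margin

tracks = {
    "piano": [36, 40, 43, 72, 76, 79],  # low chords plus high melody
    "lead":  [72, 74, 76, 79],          # stays up high
}
for name, pitches in tracks.items():
    if spans_two_staves(pitches):
        print(f"{name}: two staves -- consider a split, or a camel-hump EQ")
```

A check like this on exported MIDI could tell you which tracks deserve a closer look before you even open the mixer.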
One more thing to look for in Staff view (there are other things to look for, but I have to wrap up and go): are your notes all sitting squarely on the treble staff? That's something to watch for as well, because (for historical reasons) the treble staff sits right smack in the center of our musical hearing, where (more or less) the human voice sings and talks. So, if a lot of your tracks are hanging around the treble staff, then they're sitting on each other, or on your vocals, or on both. (You can maybe see this more clearly in PRV, but again, Staff view displays it differently, so something might jump out at you. Not literally jump, mind you, it's only a computer.) The solution to too many treble tracks ("The trouble with trebles," for you old-time Trekkies out there) is probably not splitting tracks, but <egad> cutting tracks, or at least transposing them away from the middle.
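Counting the treble-crowders is the same kind of quick check. Another hypothetical Python sketch (treating the treble staff as roughly E4-F5, MIDI 64-77; that range, the 50% threshold, and the track data are all my assumptions):

```python
# Sketch: count how many tracks mostly live on the treble staff,
# approximated here as E4 through F5 (MIDI notes 64-77).
TREBLE_LO, TREBLE_HI = 64, 77

def mostly_treble(pitches, threshold=0.5):
    """True if more than `threshold` of a track's notes sit on the staff."""
    on_staff = sum(1 for p in pitches if TREBLE_LO <= p <= TREBLE_HI)
    return on_staff / len(pitches) > threshold

tracks = {
    "pad":  [64, 67, 71, 72],
    "lead": [72, 74, 76, 77],
    "bass": [36, 40, 43],
}
crowded = [name for name, p in tracks.items() if mostly_treble(p)]
print(f"{len(crowded)} tracks crowding the treble staff: {crowded}")
```

If that list comes back long, that's your cue to cut or transpose, per the above.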
Somebody was talking to me (virtually) about the completed project I mentioned up top and had a question about the "score" of the piece, which I hadn't really looked at, because, while I read music, I have gotten away from using traditional notation when working inside Sonar. (Sonar focuses on the piano roll, not on traditional notation, he said politely.) This conversation led me to peek at the score anyway, and boy, did conflicting voicings pop out: way-high notes and way-low notes on the same track. Yikes! Wish I had peeked while I was working on the project instead of after it was done and out in public. Oh well, assuming I don't forget everything I just typed, I'll check the notation during mixing (and maybe even during composition) next time, and my next project should sound even better for it.