Hi Rain, yeah thanks. I love all this stuff and have been involved with it for years. The good thing now is that the playback method for the visuals, and how it connects to our DAW, is much better than it used to be. In the old days it was a Betacam SP machine or something similar generating timecode, which fed into a synchroniser, which in turn controlled your DAW. We did a lot of freeze-framing on VHS machines and the like. Today we just import the vision (the composer's copy, that is) into a track and it plays back, reliably too. We can do this all day long and the vision will always be in sync and in the right place against our DAW's time ruler.
It is easier to change tempos after the fact now as well. The first frame of timecode that starts your DAW is important, as is the tempo of your cue. In sound design, though, you can stay at 120 BPM, which is handy because it relates to real time very neatly: every beat falls exactly every half second.
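To put numbers on that half-second thing, here is a quick arithmetic sketch, nothing DAW-specific:

    bpm = 120
    seconds_per_beat = 60.0 / bpm            # 0.5 s, so a beat lands every half second
    seconds_per_bar = 4 * seconds_per_beat   # 2.0 s for a 4/4 bar
    print(seconds_per_beat, seconds_per_bar, seconds_per_bar * 25)  # 0.5, 2.0, 50 frames per bar at 25 fps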
There are different frame rates for film and TV/video. Film is a nice neat 24 frames per second, which is the most logical from a music perspective. For video and TV we are lucky in Australia: because our mains frequency is 50 Hz, we fit a neat 25 frames into every second. In the US, 60 Hz is the reference instead, and you guys squeeze 30 frames (29.97, strictly speaking) into each second for TV/video, so you have to do some drop-frame timecode trickery to make it all work. You do get slightly finer resolution though: here our frames are 40 ms apart, in the US they are about 33.3 ms apart. The idea is to align the timecode in the vision you are working with and the timecode in your DAW, and the good news is it is all much easier now. You can scrub slowly around the hit points in the vision and see them in slow motion, which also makes it easier.
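If you want to check those frame spacings yourself, it is just 1000 divided by the frame rate, and converting a timecode position to seconds is the same idea. A rough Python sketch (the frame rates and the example timecode are purely illustrative, and it ignores drop-frame):

    def frame_duration_ms(fps):
        return 1000.0 / fps

    for fps in (24, 25, 30):
        print(fps, "fps:", round(frame_duration_ms(fps), 1), "ms per frame")
    # 24 fps: 41.7 ms, 25 fps: 40.0 ms, 30 fps: 33.3 ms

    def timecode_to_seconds(hh, mm, ss, ff, fps):
        # non-drop-frame: frames simply count 0..fps-1 within each second
        return hh * 3600 + mm * 60 + ss + ff / fps

    print(timecode_to_seconds(0, 1, 30, 12, 25))   # 90.48 s at 25 fps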
The number of frames that fits into, say, a 4/4 bar at a certain tempo varies all the time; how frames relate to 1/16th notes keeps changing. The film composer has bigger fish to fry though, and that is what the music is going to be saying under any given cue at any point. It is all about starting and moving at a certain pace, and the music fulfilling its role well along the way. You don't have to sweat the individual frames so much, but rather the hit points, which are much further apart, often many seconds. The music can cruise over many vision edits, which has the effect of smoothing things out visually. You can see hit points coming a long way ahead, so it is in how you lead up to them and land on them musically that the technical part creeps in: adjusting your tempo at various points so you land where you want to, e.g. on beat 1 of a bar. Modern DAWs have got this all sorted now, which is great.
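The landing-on-beat-1 bit is just arithmetic too. Here is a minimal sketch of the idea, assuming you know how many seconds into the cue the hit point falls and how many beats you want to fit before it (the numbers are made up):

    def tempo_to_land(hit_seconds, beats_before_hit):
        # tempo (BPM) that makes exactly `beats_before_hit` beats fit into `hit_seconds`
        return 60.0 * beats_before_hit / hit_seconds

    # hypothetical hit point 23.4 s into the cue, landing on beat 1 of bar 9
    # in 4/4, i.e. after 8 full bars = 32 beats
    print(round(tempo_to_land(23.4, 32), 2))   # about 82.05 BPM

In practice you would nudge that to a tempo that feels right and let the DAW's tempo map take care of the rest, which is what I mean by the DAWs having it sorted.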
For some reason the TV/video guys like all their audio at 48 kHz, so you need to get into the habit of making music sessions and productions at that sample rate (not critical, because they can upsample when importing 44.1 kHz music cues into their own video edit software). It is not a bad idea to get right into how your own DAW software handles all the video stuff: tempo maps, how timecode is embedded in your music tracks, how it all relates, and so on. There will be plenty of Logic tutorials around on the subject. A good book is OK too; just make sure it is up to date in terms of how we work with vision right now. As I find them I will post some links. Logic handles it all very well, and I have done most of my music-for-vision work in Logic.
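Back on the 48 kHz thing for a second: part of the attraction, as far as I understand it, is that 48 kHz divides evenly into the common frame rates, so you get a whole number of samples per frame. Another quick sketch, same caveats as above:

    for fps in (24, 25, 30):
        print(fps, "fps:", 48000 / fps, "samples per frame")
    # 24 fps: 2000.0, 25 fps: 1920.0, 30 fps: 1600.0
    print(44100 / 24)   # 1837.5 - at 44.1 kHz and 24 fps, frames don't land on whole samples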
If it's music, then it is really about making the most appropriate music for that vision. If it's (realism) sound design, it's about atmos and Foley effects to rebuild the scene sound-wise. A new breed of sound designer has also emerged, concentrating on unusual, out-of-this-world effects. I see some territory in between music and sound design too: you can incorporate sound design into music cues and vice versa.