X3 Producer: Why is it so difficult to record audio from a soft synth?
2015/02/21 14:03:57
Earwax
Anderton
Earwax

 
If you recorded audio, whatever you recorded would be "frozen" and you would not be able to make tempo or time signature changes afterward. The feel you captured with audio would be the feel you're stuck with, and there is nothing you could do about it except intricate time-stretching, cutting, pasting, etc.
 
Just like audio, if you record MIDI in real time and it has the right feel, it will play back with the right feel. If you record MIDI in real time with the wrong feel, it will play back with the wrong feel. You do not have to conform MIDI to a tempo or grid; you can record it simply as free-flowing data.
 

 
Neither does MIDI, if all you want to do is capture a live performance of someone playing an instrument with MIDI.
 

 
You don't have to use stand-alone. You could host multiple instruments within a program like SONAR or anything else capable of hosting VSTs, or use Reason's instruments and grab their separate outputs. Or if you have to use multiple stand-alone instruments, you could use Windows audio drivers, or Mac laptops with Core Audio. Apple's MainStage is designed specifically to host instruments for live performance because Logic, like SONAR, is designed for production and not live performance.
 
I have already acknowledged that the suggested workarounds can work okay, depending on the elements involved. As an improvising musician who enjoys recording live with others, on the other hand, I find the current situation to be sorely lacking. If my computer can handle it, it makes sense I should be able to do it.


What I italicized in your quote is where "the rubber meets the road." I understand what you want, and why you want it. But I just don't think today's computers running sophisticated DAWs and power-hungry VSTis are capable of doing what you want with sufficiently low latency to match the experience of recording instruments into something like a stand-alone hard disk recorder. Sure, you could record the audio from a few instruments at a time into the computer, but if you load the computer up with VSTis, all bets are off. You could record at 96 or 192 kHz to reduce latency, but then you're limited in the number of available real-time audio streams. As I mentioned, Thunderbolt II might be the answer, but its widespread adoption is still a ways off.
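To put rough numbers on the latency side of that, here is a back-of-the-envelope sketch in Python; the 256-sample buffer is just an assumed example, and real round-trip latency adds converter and driver overhead on top of this:

# One-way buffer latency = buffer size / sample rate.
# The 256-sample buffer size is an assumption for illustration.
buffer_samples = 256
for sample_rate in (44100, 96000, 192000):
    latency_ms = buffer_samples / sample_rate * 1000
    print(f"{sample_rate} Hz: {latency_ms:.1f} ms per buffer")
# Prints roughly 5.8 ms at 44.1 kHz, 2.7 ms at 96 kHz, 1.3 ms at 192 kHz.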
 
If you really want to do this and don't want to use a stand-alone recorder, I would suggest Ableton Live. Probably even a "lite" version would do what you need. Unlike SONAR and other DAWs, which are designed for production, Live is designed and optimized specifically for live performance. There are lots of things it can't do (like comping), but it's a very agile program and it's what I use for live performance because it does that better than any other computer-based program. Programs like SONAR and Pro Tools, regardless of whether or not they can physically record an audio output with the computer, were never designed for live performance. You'll still have latency issues with Live, but if Live can't do what you want, then there's probably no computer-based solution at this time that will do what you want.
 
Think of it this way: You have a van so that you can take your family places, pick up groceries, take vacations, transport your gear to the gig, etc. If you want to be able to take curves at 70 mph on the Amalfi drive, you need a sports car.




Craig,
 
Thanks for that! It was very helpful. A few thoughts -
Sanderxpander’s comment above regarding use of Sonar’s metronome sparked further thought about something I had just taken for granted.
 
It seemed that whenever I tried using Sonar’s metronome as a “pulse” BPM while recording live, it always messed up my MIDI performance. The MIDI playback was not what I originally played. For example, if I set the metronome to play one-one-one-one at 115 BPM, I can play whatever I want (subdividing time, changing time signatures, playing behind the beat, playing polyrhythms, changing tempo, etc.) with the 115 BPM pulse as a reference to keep me in time and to come back to after intentionally speeding up or slowing down. However, when playing back my MIDI performance, it does not match my audio performance. Based on what you say, I have obviously been doing something wrong?
 
Clearly, having the metronome in place to keep all things MIDI “in time” while recording all the audio data makes it easier for musicians to stay in time with any musical moves being made. Can a musician play a VSTi with non-MIDI-recordable elements, use those elements in conjunction with one VST’s MIDI (not audio) output (let’s say a multi-tap delay) and another VST’s audio output (let’s say an amp sim), and use Sonar’s metronome to stay in time?
 
What I’m after (and it appears you understand this) is the musician having the best of all worlds. If I, with 1 or 2 other musicians, can record
 
(1) Any VSTi’s output live, allowing me to manipulate non-MIDI-recordable elements and capture “free-wheeling” sonic elements
(2) Use MIDI’s timekeeping capabilities where needed
(3) Record MIDI VST automation
(4) Route the recording of all of that to any and as many tracks as I want
all in real time, I’d be a happy guy.
 
Is it doable now? Except for the metronome problem (maybe not a problem, or maybe a problem that can be easily solved), we can come close with the workarounds we have now.
 
As always Craig, thanks for your time!
 
 And yeah, I do LOVE Ferraris!!
2015/02/21 14:15:18
Sanderxpander
Perhaps you had record quantization on or something? I've never noticed this, and unless something is wrong it really shouldn't happen. MIDI records the note when you press the key. Unless something is set to change its position in time, it will play back exactly where you originally played it.
2015/02/21 14:24:01
swamptooth
Could be your preferences/project/clock ticks per quarter note isn't maxed out.  Sonar only goes up to 960, so there may be some timing issues based on that as well.
2015/02/21 14:32:17
konradh
My workaround is to freeze the track and then Shift+Control drag the audio to another audio track.  Then I can unfreeze.
 
The result is an audio track as though it were recorded by routing the VI to another track.
2015/02/21 14:34:43
Sanderxpander
That was pointed out before too but we're talking specifically about LIVE recording.
2015/02/21 14:56:02
JoseC.
Earwax

Maybe slaving MIDI timing to the audio buffer makes it easier to accomplish the task of VST/VSTi live audio recording. I don't know. It would be interesting indeed to find out what the trade offs are.


I agree about finding out what the trade-offs are. Personally, I am under the impression that slaving the MIDI stream to the audio buffer leads to MIDI jitter and timing inconsistencies, but this is only based on my own intuition, and on comparing how Sonar behaves when recording live MIDI from synced hardware sequencers and arpeggiators with how other DAWs behave on which MIDI sequencing was a late addition to the feature set.
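If incoming MIDI really were timestamped only at audio-buffer boundaries, the worst-case timing error would be one buffer period; a rough sketch of that worst case follows (the sample rate and buffer sizes are assumed examples, and whether SONAR actually stamps live MIDI this way is exactly the open question):

# Worst-case MIDI timing error if events are only stamped at buffer boundaries.
sample_rate = 44100  # Hz, assumed for illustration
for buffer_samples in (64, 256, 1024):
    jitter_ms = buffer_samples / sample_rate * 1000
    print(f"{buffer_samples}-sample buffer: up to {jitter_ms:.1f} ms of jitter")
# Prints roughly 1.5 ms, 5.8 ms, and 23.2 ms respectively.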
2015/02/21 15:30:09
brundlefly
I would suggest that any MIDI/Audio/Metronome timing issues be taken to a new thread for troubleshooting. That's not typical, and shouldn't be used as a justification for needing real-time soft-synth recording.
2015/02/21 15:56:38
Jeff Evans
I find the ability to record VST instruments, tracks, buses, etc. in real time very handy. Yes, the feedback loop can sometimes be an issue, but you just get into the habit of recording without monitoring. Simple, really. I have a few VST synths that never repeat themselves, and this is a must during experimentation. Great to capture moments of greatness when they happen. It is faster and easier to be able to switch this on inside your DAW software.
 
I also work with a digital mixer and can send stems from my DAW software to it, enabling real-time control. It is still faster to put a track into record and capture your main mix while you make moves in real time. Sometimes the nicest things happen that way too, and it is good to be able to record them when they do. There is something magical in having the human touch over some aspect of a mix while it is being printed.
2015/02/21 17:01:25
Anderton
swamptooth
Could be your preferences/project/clock ticks per quarter note isn't maxed out.  Sonar only goes up to 960, so there may be some timing issues based on that as well.



960 ppqn provides roughly 500-microsecond resolution at 120 BPM, and I doubt keyboard players can play with that degree of accuracy. And even if they could, I think other variables in the computer (and the MIDI controller, assuming it has a scanned keyboard) would have a greater impact on timing. With a hardware MIDI device going through the 5-pin DIN connector, each note takes about a millisecond to transmit, so you'll get a minimum 10 ms spread across a chord played with both hands (plus more from the keyboard scanning time). MIDI-over-USB gives much better results.
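For anyone who wants to check those numbers, here is the rough arithmetic (the 10-note chord is just an example):

# Tick length: a quarter note at 120 BPM lasts 0.5 s.
ticks_per_quarter = 960
tick_us = 0.5 / ticks_per_quarter * 1_000_000
print(f"{tick_us:.0f} microseconds per tick")  # ~521 us, i.e. about 0.5 ms

# 5-pin DIN MIDI runs at 31,250 baud, 10 bits per byte, 3 bytes per note-on.
note_on_ms = 3 * 10 / 31250 * 1000
print(f"{note_on_ms:.2f} ms per note-on")                      # ~0.96 ms
print(f"{10 * note_on_ms:.1f} ms spread for a 10-note chord")  # ~9.6 ms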
 
I did an interesting experiment on the Mac many, many years ago for Keyboard magazine. Computer jitter increased in proportion to the resolution, so the effective timing accuracy was the same whether you were recording at 48 ppqn or 1,024 ppqn! I'm sure things have improved since then...at least I hope so...
2015/02/21 17:08:34
swamptooth
Just throwing it out there because a post in the forum a couple of months ago was from a user trying to import MIDI from Reason, which records and exports at something like 15,000 ppqn... He claimed the downgrade to a lower resolution caused significant timing issues.