Setting up an arpeggiator plugin in Sonar

Author
Paul P
2015/12/27 03:31:56 (permalink)


I've been playing around trying to get two arpeggiators to work in Sonar (the old free Kirnu and BlueARP).  I finally succeeded, but the setup required was not what I had expected, and doesn't seem right conceptually.  I've drawn up a few diagrams to illustrate what I think is or should be going on and would appreciate comments on why the end result has to be what it is.  For one thing, I have a feeling that the Synth Rack is a bit different from what its hardware equivalent would be.  What follows can also be treated as a roundabout tutorial on how to set up an arpeggiator in Sonar.

Here's the basic setup of what is needed, not necessarily how it has to be implemented using Sonar.  An Arpeggiator and a Synth are loaded in the Synth Rack.  MIDI notes from playing a keyboard are sent to the Arpeggiator, which does its thing and sends modified MIDI notes to the Synth.  The Synth then does its thing and outputs AUDIO.



It seems to me that it would be nice if this could be done directly in Sonar, but there is no way to assign input or output to instruments in the rack.
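
Just to make the Arpeggiator's 'does its thing' concrete, here's a rough sketch in C++ (my own illustration, not how Kirnu or BlueARP are actually written) of the kind of transformation an arpeggiator applies: it takes whatever notes are currently held and re-emits them one at a time on a clock.

// Sketch of the note transformation an arpeggiator performs
// (illustration only - not Kirnu's or BlueARP's actual code).
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <vector>

struct Step { uint8_t note; int startTick; int lengthTicks; };

// Take the chord currently held on the keyboard and lay its notes
// out one after another as an ascending "up" pattern.
std::vector<Step> arpeggiateUp(std::vector<uint8_t> heldNotes,
                               int steps, int ticksPerStep)
{
    std::vector<Step> out;
    if (heldNotes.empty()) return out;
    std::sort(heldNotes.begin(), heldNotes.end());       // lowest note first
    for (int i = 0; i < steps; ++i) {
        uint8_t note = heldNotes[i % heldNotes.size()];   // cycle through the chord
        out.push_back({note, i * ticksPerStep, ticksPerStep / 2});  // 50% gate
    }
    return out;
}

int main()
{
    // Holding a C major chord (C4 E4 G4) for eight 16th-note steps at 240 ticks/step
    for (const Step& s : arpeggiateUp({60, 64, 67}, 8, 240))
        std::cout << "note " << int(s.note) << " at tick " << s.startTick
                  << " for " << s.lengthTicks << " ticks\n";
}

The point is that what comes out the other end is a brand new stream of MIDI notes, which is why there is more than one place worth recording, as described next.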
 
Next we add the ability to record at various stages.  The raw MIDI notes from the keyboard can be recorded to a midi clip (in a midi track).  The MIDI output of the Arpeggiator can also be recorded to a midi clip.  Finally, the audio output of the Synth can be recorded to an audio clip (in an audio track).



It is not possible to set things up this way in Sonar, though the end result is sort of similar and the three stages of recording are possible.  The difference is that Sonar doesn't have 'cables' to connect components together, it only has tracks.  So you can't connect a keyboard to an instrument, you must create a track that takes its input from the keyboard and sends it along to the instrument, while possibly also recording as it does so (much like Gibson's Memory Cable).



Somehow in Sonar instruments and tracks are not separate and independent, they are joined in ways that I can't quite figure out.  Especially in the case of a midi instrument whose only purpose is to process midi and send it on to another midi instrument.  For example, trying to get Kirnu to work, I started by loading it in the Synth Rack and also loaded an instance of Dimension Pro playing a piano patch, neither having any tracks assigned to them.  I then created a MIDI track 1 that took input from the virtual keyboard controller and output MIDI to Kirnu.  I created another MIDI track 2 that took its input from Kirnu and output MIDI to Dimension Pro.  And I created an audio track 3 that took its audio input from DimPro and sent it on to the Master bus.


 
With this setup in Sonar, MIDI from the keyboard reaches the MIDI track 1, but goes no further.  This also does not work with BlueARP (and worse, BlueARP's GUI window is completely blank).  This setup is however how I think it should be done (though I would prefer cables instead of tracks to link things together).

For reasons I don't understand, what needs to be done is to create an AUDIO track that takes its AUDIO input from Kirnu and outputs it nowhere (none).  Suddenly everything works as expected and in the case of BlueARP, its GUI also magically appears.  (Note that input echo must be on for both MIDI tracks for content to move forward).  This doesn't make sense to me.  Not only do Kirnu and BlueARP not produce any audio, but what's the point of a track that goes nowhere?  This audio track can be muted or not, it doesn't matter.



Could someone please explain to me why this audio track is needed?
 
An Instrument Track can be used to hide this mystery audio track, but to me that is even more confusing since you can't tell what it is that makes things work.  And in this case the Instrument Track's meter shows no activity, even though notes get through to DimPro.

post edited by Paul P - 2015/12/27 04:48:53

Sonar Platinum [2017.10], Win7U x64 sp1, Xeon E5-1620 3.6 GHz, Asus P9X79WS, 16 GB ECC, 128gb SSD, HD7950, Mackie Blackjack
#1


    scook
    Forum Host
    Re: Setting up an arpeggiator plugin in Sonar 2015/12/27 09:59:38 (permalink)
    Paul P
    Could someone please explain to me why this audio track is needed?

    I can only guess.
     
    The plug-ins mentioned are implemented as VSTi's. In SONAR, virtual instruments use a combination of an audio track and a MIDI track, or an instrument track.
     
    If these were MFX plug-ins, they could be placed in the MIDI FX rack. Cakewalk provides such a plug-in.
    #2
    tlw
    Re: Setting up an arpeggiator plugin in Sonar 2015/12/27 10:08:49 (permalink)
    Kirnu's arpeggiator is a VST, is it not?

    I don't have either of the plugins you mention, but this is how I would expect kirnu to need routing in Sonar.

    VSTs can only be loaded into audio tracks because VST is an audio plugin protocol, not a MIDI one. Confusing, I know.
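
    To make that concrete, here's roughly what a MIDI-only VST 2.x plugin looks like on the inside (just a sketch against the old Steinberg VST 2.x SDK, not Kirnu's or BlueARP's actual source). The audio callback the protocol demands is still there, it just writes silence, while the real work happens on the MIDI events:

    // Sketch of a MIDI-only VST 2.x plugin (assumes the old Steinberg VST 2.x SDK).
    #include "audioeffectx.h"
    #include <cstring>

    class MidiOnlyArp : public AudioEffectX
    {
    public:
        MidiOnlyArp(audioMasterCallback master) : AudioEffectX(master, 0, 0)
        {
            setNumInputs(0);     // no audio in
            setNumOutputs(2);    // the host still expects an audio output pair
            isSynth();           // so SONAR treats it as a VSTi / soft synth
            setUniqueID('MArp');
        }

        // Audio callback: required by the protocol, but there is nothing to play,
        // so it just fills the host's buffers with silence.
        void processReplacing(float** inputs, float** outputs, VstInt32 sampleFrames)
        {
            for (VstInt32 i = 0; i < sampleFrames; ++i)
                outputs[0][i] = outputs[1][i] = 0.0f;
        }

        // Incoming MIDI arrives here; a real arpeggiator would queue these notes
        // and send its own, re-timed events back to the host from the audio callback.
        VstInt32 processEvents(VstEvents* events)
        {
            sendVstEventsToHost(events);   // simple pass-through shown for brevity
            return 1;
        }

        // Tell the host this plugin both receives and sends MIDI.
        VstInt32 canDo(char* text)
        {
            if (!strcmp(text, "receiveVstMidiEvent") || !strcmp(text, "sendVstMidiEvent"))
                return 1;
            return -1;
        }
    };

    Which is why the "audio track that goes nowhere" has to exist: it's simply the slot that silent audio callback lives in, and the host won't run the plugin without it.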

    Firstly, I recommend inserting synths using separate MIDI and audio tracks, not a combined 'instrument track'. The two track approach makes getting things like this working much easier.

    Because it is a VST, kirnu needs to be loaded as a synth, which means in an audio track. To get MIDI to kirnu you need a MIDI track with its output pointing at kirnu.

    The MIDI from the controller provides the input to kirnu's MIDI track.

    To get MIDI out of kirnu you need to insert it from the synth rack browser and make sure that the check box for kirnu to be allowed to output MIDI is enabled. This can be a bit buggy unfortunately...

    The synth that makes the sound is then inserted into the project as an "ordinary" synth. That synth's MIDI channel's input drop-down should be set to point to kirnu.

    Assuming the MIDI channels are set correctly, that should do the trick.

    So:

    Controller ->MIDI track 1 -> kirnu (in audio track 1).
    Kirnu MIDI out -> synth's MIDI track -> synth's audio track(s).

    As scook says, Sonar has its own arpeggiator as part of any MIDI track's available functions. You can access it through the MIDI track's inspector view. Sonar's arp is pretty good, far more complex than most built into synths.
    post edited by tlw - 2015/12/27 10:22:21

    Sonar Platinum 64bit, Windows 8.1 Pro 64bit, I7 3770K Ivybridge, 16GB Ram, Gigabyte Z77-D3H m/board,
    ATI 7750 graphics+ 1GB RAM, 2xIntel 520 series 220GB SSDs, 1 TB Samsung F3 + 1 TB WD HDDs, Seasonic fanless 460W psu, RME Fireface UFX, Focusrite Octopre.
    Assorted real synths, guitars, mandolins, diatonic accordions, percussion, fx and other stuff.
    #3
    Paul P
    Re: Setting up an arpeggiator plugin in Sonar 2015/12/27 23:29:04 (permalink)
     
    Thanks for the help, I'm understanding things better now.  I wonder why Kirnu and BlueARP are set up as vsti's and not midi fx's.  That's certainly a source of confusion, especially since they won't work without the audio connection of a vsti, even though they don't use it.  It would make more sense for them to be midi fx (or instrument), which is sort of what I thought they were.
     
    I don't think I can get my mind around putting a synth into an audio track.  Or putting a midi fx into a midi track.  To me a track (or a clip) is just a container to hold musical content (like a line of ferrite in an audio tape).  But in Sonar, tracks also connect various components together and house fx racks and sub-containers called clips (which house their own fx racks).  I would really prefer to consider the components (synths, fx, and racks thereof) as being outside of the tracks and clips, with the inputs and outputs of the tracks (given the lack of 'cables') connecting the components together like it was hardware.
     
    It's interesting that the Synth Rack stands on its own (at least visually) whereas fx racks are displayed 'within' tracks and clips.  I imagine this is more a matter of graphical convenience than of conceptual rigour.  But it's confusing to have such a hardware-inspired UI also present things you'd never see in hardware.
     

    #4
    tlw
    Re: Setting up an arpeggiator plugin in Sonar 2015/12/28 23:12:38 (permalink)
    Personally I can't think of a DAW that isn't organised like, or almost like, Sonar. And it's a very hardware-like approach as well.

    Think of each track as a channel strip in a giant mixer. The track header and inspector are the mixer and its insert points, and the audio and midi clips represent the content of the multitrack recorder and sequencer they feed. Busses are the equivalent of the mixer busses that are used for mixdown and for feeding effects that operate on sends from channels rather than in the channel inserts, such as a reverb plugin fed by track sends.

    The interconnecting cables are the channel and bus sends and the channel/bus outputs. Rather handily, and unlike hardware, we can have as many sends per channel as we like.

    Unlike in hardware, all the stuff patched into a channel as inserts - eq, prochannel, fx bin - is conveniently placed together so you can see at a glance which feeds into which and what is going on. Same with the busses: effects placed there are also shown together. And you can have as many channels and busses and effects units as your computer can stand.

    Hardware may have all had to sit in racks, with everything labelled up with what it's connected to and so on, but personally I'd find having to sort through a single "virtual rack" of maybe 50 or more eqs or a similar number of compressors plus a large number of other plugins, all the time needing to correctly identify the one that relates to the channel I'm dealing with, a nightmare. A modern DAW can be running more processors than even the best equipped pre-digital studio and mistakes would be far too easy to make. Even more so when you start adding say two identical compressors into a single track's virtual insert points...

    I hardly ever use the synth rack either, but that could just be because I tend to use hardware synths so don't usually have more than two or three plugin synths active.

    As for why VST and not MIDI effects: as I understand it there's no one standard for MIDI plugins, which means they are application specific - one that worked in Sonar might well be useless in Cubase etc. VSTs however can work in any DAW that supports the VST standard, which all PC DAWs do. Since the VST standard allows plugins to receive and send MIDI, it's the simplest way to do things.

    Again, I also suggest you take a look at Sonar's built-in arp and the huge number of patterns it comes with. You can also create your own arps by using the piano roll or step sequencer and just drawing the required notes in.

    #5
    Paul P
    Re: Setting up an arpeggiator plugin in Sonar 2015/12/29 01:57:53 (permalink)
    tlw
    Think of each track as a channel strip in a giant mixer. The track header and inspector are the mixer and its insert points, and the audio and midi clips represent the content of the multitrack recorder and sequencer they feed. Busses are the equivalent of the mixer busses that are used for mixdown and for feeding effects that operate on sends from channels rather than in the channel inserts, such as a reverb plugin fed by track sends.



    Wouldn't it have been better if tracks in Sonar were called channels like in a real console?  This may be another thing contributing to my confusion.  Then clips could have been tracks (and clip fx would be the equivalent of fx built into the tape deck?).
     
    I realize I'm picking nits, but I'm doing so from a desire to understand.  I can more or less use Sonar as intended whether or not the way it's built makes sense (and I'm not saying it doesn't).  It's when I went to do a slightly more complicated routing scheme that I realized that I didn't really know how things were connected and moving around inside, and what the overall philosophy was.  To understand that, I had to first learn the terminology, then the architecture, and it wasn't quite what I expected (though that doesn't mean much since I have very little prior recording/mixing experience).  It didn't help that the arpeggiators I'm playing with are built to conform to constraints I know nothing about.
     
    Things are much clearer now, though.  And I've learned that vsti's must have an audio track associated with them, whether they actually output audio to it or not.  I've also become acquainted with both the built-in arpeggiator and the Cakewalk-fx arpeggiator plugin.  Thanks tlw.
     

    #6