vladasyn
I asked Tracktion tech support how to set up MIDI Learn so a controller can control knobs in BioTek, and guess what, they say my host DAW should handle it.
I do not think so...
I know there is that MIDI Learn icon in every plugin window, but I rarely use it.
I do see it... Which Sonar version are you using?
vladasyn
There is no keyboard controller involved, only the mouse. I need to record mouse movements. The only way I know to record mouse movements is to activate automation.
Kind of, yes.
To activate automation, you need to set which controller to record. By default it is set to Volume. In the case of a virtual X/Y pad like this synth offers, that would take two parameters, let's say Reso and Cutoff. Is there a way to record both parameters?
The simplest method is to arm ALL; just deploy the latest update in case you have already installed Manchester (there was a bug in the initial release).
But let me explain how all that works, because "automation" has nothing to do with "MIDI Learn", at least not directly.
A VSTi has some parameters and its own user interface. On that user interface, the VSTi can put knobs, X/Y panels or whatever the plug-in wants to show to the user. By the definition of the VST standard, the DAW is not aware of what is going on in this interface, nor can it automatically see/control internal plug-in parameters. So it is not possible for (any) DAW to implement "MIDI Learn" for an arbitrary graphical element in a VSTi interface.
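To make that concrete, here is a rough C++ sketch of the shape of the host/plug-in contract. This is not the real VST SDK, just an illustration with made-up names: the host can forward events, get/set numbered parameters and ask the plug-in to open its editor, but there is no call like "list the knobs on your GUI", so the host has nothing to attach a MIDI Learn to.

    // Illustration only, not the VST SDK: the rough shape of what a host
    // can ask a plug-in to do. Note there is no way to enumerate GUI widgets.
    struct MidiEvent { int status, data1, data2; };

    class PluginInterface {
    public:
        virtual ~PluginInterface() = default;

        // Event path: the host forwards MIDI, the plug-in decides what it means.
        virtual void processEvents(const MidiEvent* events, int count) = 0;
        virtual void processAudio(float** in, float** out, int frames) = 0;

        // Parameter path: a flat, numbered list of automatable 0..1 values.
        virtual int   parameterCount() const = 0;
        virtual void  parameterName(int index, char* name) const = 0;
        virtual float getParameter(int index) const = 0;
        virtual void  setParameter(int index, float value) = 0;

        // GUI path: the host only gets "open your editor in this window".
        // Whatever is drawn inside stays invisible to the host.
        virtual void openEditor(void* parentWindow) = 0;
    };

    int main() {}  // nothing to run; the point is the shape of the interface above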
A VSTi normally responds to MIDI events. Some of these MIDI events can be used to control parameters. Which events control which parameters is defined inside the plug-in, statically or by its own "MIDI Learn". The DAW has no influence on that. What the DAW can do in that respect is map some MIDI events to some other MIDI events before sending them to the VSTi. In Sonar, that is implemented by Drum Maps for notes and MFX (MIDI FX) for any type of controls.
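As a small illustration (plain C++, not BioTek's or anyone's real code; CC 74 and the parameter names are just examples), this is roughly what a plug-in-internal "MIDI Learn" does: the plug-in itself remembers which CC number drives which of its parameters, and the DAW only delivers the events.

    #include <cstdio>
    #include <map>

    // Illustration of a plug-in-internal "MIDI Learn": the mapping from CC
    // numbers to parameters lives inside the plug-in, not in the DAW.
    struct PluginMidiLearn {
        std::map<int, int> ccToParam;   // CC number -> internal parameter index
        int learnTarget = -1;           // parameter waiting for a CC, -1 = off

        void startLearn(int paramIndex) { learnTarget = paramIndex; }

        // Called for every incoming CC the DAW forwards to the plug-in.
        void onCC(int ccNumber, int ccValue, float params[]) {
            if (learnTarget >= 0) {             // learn mode: remember the mapping
                ccToParam[ccNumber] = learnTarget;
                learnTarget = -1;
            }
            auto it = ccToParam.find(ccNumber);
            if (it != ccToParam.end())
                params[it->second] = ccValue / 127.0f;   // 0..127 -> 0..1
        }
    };

    int main() {
        float params[2] = {0.f, 0.f};   // say: 0 = Cutoff, 1 = Reso
        PluginMidiLearn learn;
        learn.startLearn(0);            // user clicks "learn" on Cutoff
        learn.onCC(74, 64, params);     // the next CC (74) gets bound to Cutoff
        learn.onCC(74, 127, params);    // from now on CC 74 moves Cutoff
        std::printf("Cutoff = %.2f\n", params[0]);
    }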
A VSTi can also expose parameters as "automatable". Which parameters are exposed, and under which names, is up to the concrete VSTi. These parameters are named and have values in the range 0.0 to 1.0 (floating point). In other words, they have absolutely nothing to do with MIDI.
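Sketched in the same spirit (the names and ranges are invented for the example, BioTek's real ones may differ): the host only ever sees the parameter name and the normalized 0..1 value; how that maps to Hz, dB or anything else is the plug-in's private business.

    #include <cstdio>

    // Illustration of an "automatable parameter" as the host sees it:
    // a name plus a normalized 0..1 value. The real range is plug-in internal.
    struct Param {
        const char* name;
        float normalized;            // 0.0 .. 1.0, what the DAW records
        float minReal, maxReal;      // internal range, hidden from the DAW
        float real() const {         // plug-in's own (here linear) mapping
            return minReal + normalized * (maxReal - minReal);
        }
    };

    int main() {
        Param params[] = {
            {"Cutoff", 0.5f, 20.f, 20000.f},   // example range, not BioTek's
            {"Reso",   0.2f, 0.f,  1.f},
        };
        for (const Param& p : params)
            std::printf("%-6s host sees %.2f, plug-in uses %.1f\n",
                        p.name, p.normalized, p.real());
    }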
Sonar lets you record these parameters as automation, any number of them in parallel and independent of what triggers the changes (there is a small sketch of the recording side right after this list). That can be:
1) the VSTi's own user interface (the VSTi informs the DAW that these parameters are modified)
2) "MIDI Learned" controls within sonar, called "Remote control...", including:
2.a) automation Remote Control: right-click the "box" inside an opened Automation Lane
2.b) Remote Control for Assigned Controls; read more about that here:
https://www.cakewalk.com/Documentation?product=SONAR%20X3&language=3&help=SoftSynths.13.html
3) using Control Surface API methods, including:
3.a) direct control, for example used for Mackie Control (Universal (Pro)) and compatible devices
3.b) ACT Dynamic Mapping, used with other controllers. Check Chapter 40 in the Sonar Reference Guide. In plug-in windows there is an "ACT Learn" button related to that method.
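Here is the promised sketch of the recording side (a simplified model, not Sonar internals): whichever of the methods 1) - 3) triggers the change, the host ends up receiving "parameter X is now value V at time T" and appends a point to that parameter's envelope, so any number of parameters can be recorded in parallel, for example both axes of an X/Y pad.

    #include <cstdio>
    #include <map>
    #include <vector>

    // Simplified model of automation recording: every parameter-change
    // notification becomes a point on that parameter's envelope.
    struct EnvelopePoint { double time; float value; };

    struct AutomationRecorder {
        std::map<int, std::vector<EnvelopePoint>> envelopes;  // param -> envelope

        // Called regardless of the trigger: plug-in GUI, Remote Control,
        // a control surface, ACT mapping...
        void onParameterChanged(int paramIndex, float value, double time) {
            envelopes[paramIndex].push_back({time, value});
        }
    };

    int main() {
        AutomationRecorder rec;
        // Dragging an X/Y pad produces two streams of changes at once.
        rec.onParameterChanged(0, 0.10f, 0.00);   // Cutoff
        rec.onParameterChanged(1, 0.90f, 0.00);   // Reso
        rec.onParameterChanged(0, 0.35f, 0.25);
        rec.onParameterChanged(1, 0.70f, 0.25);
        for (const auto& [param, env] : rec.envelopes)
            std::printf("parameter %d: %zu points recorded\n", param, env.size());
    }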
Apart from recording live, you can also just draw the required changes in the Automation Lanes.
Also, in Sonar it is possible to create MIDI Control Envelopes. They are represented in the Track view and controlled the same way as automation, but they function differently: they influence which MIDI events are sent to the VSTi, the same way as Control Change events in the clip.
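The difference can be sketched like this (again a simplified model, CC 74 is just an example): an automation envelope delivers normalized parameter values to the plug-in directly, while a MIDI Control Envelope is rendered into ordinary CC events that are sent to the VSTi, exactly as if they were CC events in the clip.

    #include <cstdio>
    #include <utility>
    #include <vector>

    // Simplified model: a MIDI Control Envelope does not touch plug-in
    // parameters directly; it is turned into a stream of MIDI CC events.
    struct MidiCC { double time; int cc; int value; };

    std::vector<MidiCC> renderEnvelope(int ccNumber,
                                       const std::vector<std::pair<double, float>>& curve) {
        std::vector<MidiCC> out;
        for (const auto& [time, level] : curve)          // level is 0..1
            out.push_back({time, ccNumber, static_cast<int>(level * 127.0f + 0.5f)});
        return out;  // these events go to the VSTi like clip CC events
    }

    int main() {
        auto events = renderEnvelope(74, {{0.0, 0.0f}, {0.5, 0.5f}, {1.0, 1.0f}});
        for (const MidiCC& e : events)
            std::printf("t=%.1f  CC%d = %d\n", e.time, e.cc, e.value);
    }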
I hope that has clarified a bit what is there...
post edited by azslow3 - 2016/01/31 12:21:03