So why don't we have MIDI controller templates that load like a VST or FX within DAWs for configuration, if it's easy to get a value from an encoder irrespective of brand?
Why can't I pull up a picture of my controller, click on each knob with my mouse to configure its range or increment type if necessary, then click on the corresponding control in my DAW to link it, whether that's Sonar, Cubase, or any other brand? Why can't I use half of my MIDI controller in one DAW and the other half in another, if I so desired for whatever weird reason?
And if I want a control to do different things depending on the currently focused window, why not give me a checkbox to lock it to that window, and a mode switch for changing focus?
With a system like that, I could link two controls to a compound value, or change the increment steps of one encoder from another on the fly if I wished.
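Just to illustrate how little data such a template would actually need to hold per control, here's a rough sketch in Python. Everything in it is hypothetical (the ControlMapping class, the field names, the DAW target strings are all made up for illustration, not any DAW's actual format):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlMapping:
    """One knob/fader/encoder on the hardware, as a GUI template might store it.

    All field names here are invented for illustration only.
    """
    cc_number: int                        # MIDI CC the hardware sends for this control
    midi_channel: int = 1
    daw_target: str = ""                  # e.g. "Sonar: Track 1 / Pan" (illustrative label)
    value_min: float = 0.0                # range the control should map onto
    value_max: float = 1.0
    increment_mode: str = "absolute"      # or "relative" for endless encoders
    lock_to_window: Optional[str] = None  # only act while this window has focus
    linked_to: Optional[int] = None       # CC of another control, for compound values

# A whole controller "template" is then just a list of these entries,
# with half the controls pointed at one DAW and half at another if you like.
template = [
    ControlMapping(cc_number=21, daw_target="Sonar: Track 1 / Pan"),
    ControlMapping(cc_number=22, daw_target="Cubase: Filter Cutoff",
                   increment_mode="relative", lock_to_window="Cubase"),
]
```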
If controllers are sending standard data types from each control, I really do not see why this should be overly complicated, especially compared to some of the heavy mathematical number-crunching that VSTs and FX already do, or even functions like quantizing or sample stretching.
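And that "standard data type" really is tiny: a MIDI Control Change message is three bytes, defined by the MIDI spec itself rather than by any brand. A minimal decoder, just as a sketch:

```python
def parse_control_change(data: bytes):
    """Decode a 3-byte MIDI Control Change message.

    The wire format is the same regardless of which brand made the controller:
    status byte 0xB0 | channel, then controller number (0-127), then value (0-127).
    """
    status, controller, value = data[0], data[1], data[2]
    if status & 0xF0 != 0xB0:
        return None  # not a Control Change message
    channel = status & 0x0F
    return channel, controller, value

# e.g. a knob on channel 1 sending CC 21 at roughly half travel:
print(parse_control_change(bytes([0xB0, 21, 64])))  # -> (0, 21, 64)
```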
Most consumer products for the home studio are now USB, so it does not really matter whether the commands come from the mouse or the controller. Most people only have 10 fingers, so it's not as if they are going to be pressing 88 keys with aftertouch while rotating and sliding 50 controllers all simultaneously, and even if they were, I think USB could handle the data volume. So, without diving into the tech, I do not see why data has to be processed on the controller side of things before it gets to the DAW. I know some devices have bidirectional requirements for things like motorised faders and LCD screens, but again, I am pretty sure USB can cope.
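A quick back-of-envelope on that worst case, using my own made-up numbers (the 100 updates per second per stream is an assumption for illustration, not a spec figure):

```python
# Deliberately absurd worst case: every key held with polyphonic aftertouch
# while every knob and fader is moving at once.
keys = 88                 # note stream + aftertouch stream per key
controllers = 50          # continuous controllers all moving
update_rate_hz = 100      # assumed 100 updates per second per stream
bytes_per_message = 3     # status + two data bytes (4 if packed as USB-MIDI events)

streams = keys * 2 + controllers
messages_per_sec = streams * update_rate_hz
bits_per_sec = messages_per_sec * bytes_per_message * 8

print(f"{messages_per_sec:,} messages/s ~= {bits_per_sec / 1e6:.2f} Mbit/s")
# -> 22,600 messages/s ~= 0.54 Mbit/s, against 12 Mbit/s for USB full speed
# and 480 Mbit/s for USB 2.0 high speed, so raw bandwidth is not the bottleneck.
```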
So I really do not understand why I need to be looking at configuration tables instead of a pretty GUI template, supported by hardware plug-and-play. At least not on Windows 10.x.