• SONAR
  • ACT Midi Controller - still pulling my hair out (p.5)
2017/10/25 13:17:50
pwalpwal
azslow3
Can you control the mentioned space shuttle from your MIDI controller? Or can you at least answer a phone call from your MIDI keyboard? Have you already asked Google/Apple to give you a "template" to control your phone from an arbitrary device?



that's a ridiculous analogy
2017/10/25 15:02:27
azslow3
pwalpwal
azslow3
Can you control the mentioned space shuttle from your MIDI controller? Or can you at least answer a phone call from your MIDI keyboard? Have you already asked Google/Apple to give you a "template" to control your phone from an arbitrary device?



that's a ridiculous analogy

Why? Buttons and knobs can be used for anything. MIDI controllers just convert the fact that the user operates them into MIDI messages.
 
These MIDI messages are no more "prepared" to control a DAW than to control a phone or a rocket. We indirectly use many kinds of such "conversion": we use a mouse to move faders, we use on-screen keyboard touches to play a VSTi. The original intention of those devices was not music, but we can use them for it. The reverse is also true: we can use a MIDI keyboard not only to play notes, but also to call commands in Sonar (MIDI Key Bindings) or perform almost any other action (using the ACT API). And we can write a program which does something completely unrelated to music, e.g. answer a phone call (at home I have VOIP, so I can answer calls on my computer, and so in fact I could easily do this!). But for all of that A CONVERTER is required.
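To make the "converter" idea concrete, here is a minimal hypothetical sketch (not azslow3's actual code, and the class/function names are invented for illustration): raw MIDI bytes mean nothing by themselves; a mapping table is what decides whether a given message sets a DAW volume or answers a VOIP call.

```cpp
#include <cstdint>
#include <functional>
#include <iostream>
#include <map>

// A "converter": the bytes say nothing about "DAW" or "phone" --
// only the mapping table gives them meaning.
struct MidiMsg { uint8_t status, data1, data2; };

class Converter {
public:
    using Action = std::function<void(uint8_t value)>;

    // Bind a (status, data1) pair -- e.g. CC#7 on channel 1 -- to any action.
    void bind(uint8_t status, uint8_t data1, Action action) {
        table_[key(status, data1)] = std::move(action);
    }

    void onMidi(const MidiMsg& m) {
        auto it = table_.find(key(m.status, m.data1));
        if (it != table_.end())
            it->second(m.data2);   // the bound action decides what the value means
    }

private:
    static uint16_t key(uint8_t status, uint8_t data1) {
        return static_cast<uint16_t>(status) << 8 | data1;
    }
    std::map<uint16_t, Action> table_;
};

int main() {
    Converter conv;
    // The same kind of message, two completely different targets:
    conv.bind(0xB0, 7,  [](uint8_t v) { std::cout << "Set DAW track volume to " << int(v) << "\n"; });
    conv.bind(0x90, 60, [](uint8_t v) { if (v) std::cout << "Answer VOIP call\n"; });

    conv.onMidi({0xB0, 7, 100});   // CC#7  -> "volume"
    conv.onMidi({0x90, 60, 127});  // note C4 -> "answer call"
}
```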
 
My point is that writing A CONVERTER which allows controlling a DAW from a MIDI controller is no simpler than writing A CONVERTER to control a phone from the same device. There can be no "easy and intuitive" tool that lets an unaware user create such a converter from scratch, just as there is no "easy and intuitive" guitar that plays good music by itself.
 
For a long time I have tried to explain that:
1) all controllers are almost the same from the computer's perspective. It is possible to add a Mackie Control surface in Sonar (my modded version with "disable handshake"...) and, instead of an MCU, connect a normal 88-key keyboard with pitch bend to it.
The keyboard will start to mute/solo tracks, the pitch bend will start to control volume, etc. (see the sketch after this list).
It is complicated software (2.5k C++ lines for Mackie, more for the VS-700) which makes the "Mackie protocol" do something smart in Sonar.
 
2) but buttons/knobs/faders on different devices, apart from looking the same, can produce different sequences of messages. There is no "universal way" to auto-detect all possible cases. Only a person with the proper background to understand what is going on can organize things correctly and optimally.
 
(1) and (2) may sound mutually exclusive, but they are not.
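As an illustration of point (1), here is a simplified, hedged sketch of the common Mackie Control conventions (this is not the actual plugin code, and the exact note ranges may vary between firmware revisions and DAWs): a DAW that speaks the Mackie protocol treats pitch-bend messages as fader moves and certain Note On numbers as solo/mute buttons, so an ordinary keyboard plugged into that port effectively "becomes" a control surface.

```cpp
#include <cstdint>
#include <cstdio>

// Simplified reading of Mackie Control conventions (assumption: note ranges
// shown here are the commonly documented ones): faders arrive as 14-bit pitch
// bend on MIDI channels 1-9, solo/mute buttons arrive as Note On messages.
void onMackieMessage(uint8_t status, uint8_t d1, uint8_t d2) {
    uint8_t type    = status & 0xF0;
    uint8_t channel = status & 0x0F;

    if (type == 0xE0) {                        // pitch bend = fader position
        int value14 = (d2 << 7) | d1;          // 0..16383
        std::printf("Fader %d -> %.1f%%\n", channel + 1, value14 / 163.83);
    } else if (type == 0x90 && d2 > 0) {       // Note On = button press
        if (d1 >= 0x08 && d1 <= 0x0F)
            std::printf("Solo track %d\n", d1 - 0x08 + 1);
        else if (d1 >= 0x10 && d1 <= 0x17)
            std::printf("Mute track %d\n", d1 - 0x10 + 1);
        else
            std::printf("Some other Mackie button (note %d)\n", d1);
    }
}

int main() {
    onMackieMessage(0xE0, 0x00, 0x40);  // a keyboard's pitch bend wheel -> fader 1 to ~50%
    onMackieMessage(0x90, 0x10, 0x7F);  // a low piano key -> "mute track 1"
}
```

Point (2) is exactly why such a table cannot be auto-detected: a knob on one device may send an absolute CC value (0-127), while an identical-looking encoder on another sends relative increments, so the same byte stream needs a different interpretation per device.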
2017/10/25 15:13:35
pwalpwal
sure, it's what i'd call "mapping" (you're calling it A CONVERTER above), but it's ridiculous to suggest that mapping music software controls to a music hardware device's controls (where the interfaces, if not standardised, are at least in the same ballpark) is as simple/complex as mapping them to a space shuttle's controls - it was the extreme analogy fail i was commenting on, not the claim that mapping midi keyboards to sonar is still a complex activity
 

2017/10/25 19:11:27
azslow3
I work with this "toy":
[image]
The Control Room for the thing looks like a launch control center