Hi Noel,
Thanks for chiming in. Perhaps you could clarify the direction of Sonar's touch features.
I recall when X2 came out, James Oliver demonstrated the touch features in a Sonicstate video.
He said something along the lines of, "To do any editing in Sonar you will always need to use a mouse and keyboard."
Is that how you see the development of Sonar touch?
That is the nub of why I find the development of touch a bit odd. Wouldn't the logical goal of a touch interface be full use of the program, rather than relying on an external item of hardware to edit anything?
However, there is something you said that I don't fully understand, but it might be at the root of my frustration.
"To handle multitouch we cannot rely on the base windows fallback since that doesn't support multitouch"
Does this mean that implementing multitouch is mutually exclusive with Windows' touch right-click emulation?
So you can add multitouch features like the on-screen keyboard, but at the expense of having a right-click touch menu in the arrangement window?
That would mean the things I had assumed were deliberately crippled are actually just a side effect of implementing multitouch, and I'm woefully underestimating how long it will take to make Sonar a complete multitouch program. Many, many years.
Sonar Unobtainium edition?
Is this the case?
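For anyone reading along, here's a minimal Win32 sketch of the trade-off as I understand it (just an illustration of the Windows API, not Sonar's actual code): once a window registers for raw multitouch with RegisterTouchWindow, it receives WM_TOUCH messages instead of the system's default gesture handling, so the built-in press-and-hold right-click behaviour is no longer provided for free and the app would have to recreate it itself.

```cpp
// Minimal sketch: registering for raw multitouch (WM_TOUCH) opts a window
// out of the default gesture handling, so press-and-hold right-click
// emulation must be reimplemented by the application. Not Sonar's code.
#define _WIN32_WINNT 0x0601
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_CREATE:
        // Opt in to raw multitouch for this window.
        RegisterTouchWindow(hwnd, 0);
        return 0;

    case WM_TOUCH:
    {
        UINT count = LOWORD(wParam);
        if (count > 16) count = 16;
        TOUCHINPUT inputs[16];
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs, sizeof(TOUCHINPUT)))
        {
            // Raw contacts arrive here. Anything like "press and hold means
            // right click" now has to be detected manually, e.g. by timing a
            // stationary contact flagged TOUCHEVENTF_DOWN.
        }
        CloseTouchInputHandle((HTOUCHINPUT)lParam);
        return 0;
    }

    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int)
{
    WNDCLASS wc = {};
    wc.lpfnWndProc = WndProc;
    wc.hInstance = hInst;
    wc.lpszClassName = TEXT("RawTouchDemo");
    RegisterClass(&wc);

    CreateWindow(TEXT("RawTouchDemo"), TEXT("Raw touch demo"),
                 WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                 nullptr, nullptr, hInst, nullptr);

    MSG m;
    while (GetMessage(&m, nullptr, 0, 0))
    {
        TranslateMessage(&m);
        DispatchMessage(&m);
    }
    return 0;
}
```

If that's roughly what is going on under the hood, it would explain why the right-click touch menu disappeared when the multitouch work went in.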
Back on topic. If Sonar's step sequencer allowed automation of VST parameters rather than being limited to MIDI-spec messages only, or if the Matrix allowed direct editing of cell contents, it really would be an improvement on P5. These features are still superior in P5, especially if you are using a touchscreen.
Cheers J