Don't take it personally. I am really excited about touch. I work as a mobile tablet programmer for my day job.
The question for me is, 'Will it give me something better than what I have now?'
Touch being silent, so I can operate the computer while performing or recording in the same room, is useful to me right now in Sonar.
For an envelope editor or in the PRV, touch has to beat my current workflow, and the functionality shown in the video you posted didn't look like it would.
I'm not being judgmental because it's a different product. I use other DAWs as well as Sonar, depending on my current need. My point was lost in my negativity, I think. Sorry about that.
The point was that even when you focus on something small, like the fly-out EQ, it is still possible to struggle with the touch implementation. Changing the Q with touch was hard for me in Sonar, despite the fact that it is touch-enabled. I don't mean with the Q knob, either; the Q knob is the workaround I use. You can put two fingers on the node and spread or pinch them in a certain way to change the Q.
As a user, I haven't become good at it. It's a trick. I practice using it sometimes but usually return to the Q knob.
So, my point is that by trying to touch-enable an entire DAW all at once, you end up with lots of cases like the Q in the fly-out EQ: technically touch-enabled, but not in a way I would actually use.
That was my read on the Slate, and I think it's what the guy is talking about at the end of the video. The implementation requires a lot of touch-and-pause. It's stuff you could get used to, but why? When you are mixing, you can make all the mouse noise you want. You can clackity-clack on the keyboard. Why take your hands off the fast tools that work and move to slow, imprecise ones?
The learning curve with touch is two-way: it falls on both the users and the developers. I'm getting better at using touch as the Sonar developers find ways to make it useful.
I look forward to a more complete touch implementation. I kinda suspect there are branches of the app with significant touch gestures implemented, and that the team tries them out. Craig will own that he has to be pushed down the touch road a little. The guys are like, 'Hey Craig, tried it?' ... 'Uhhh, I'm gonna!' :-)
So, Craig gets a build maybe that has some increased touch functionality and plays around and they watch him as he leans back again and puts his hand on the mouse. :P (just a guess here guys).
When I use touch, it is usually my first move after not touching the DAW for a bit, maybe because I was playing guitar. Then I reach out, make some changes, and I'm good. Or I reach out, make some changes, and switch to the mouse. How often do I move back to touch from the mouse? When? For what?
Touch is great for having guests in the studio, but they aren't going to know complicated gestures or buttons to tap. They tap and use... Synths... Th3... Console View.
An example of something Sonar does really well is the virtual keyboard you can play with touch. That thing is awesome. Compare that with the Slate video, where the guy was tapping tiny keys with no control at all.