So I wandered into Best Buy over the holidays and picked up a $300 HP 23tm multitouch touchscreen. I'd recently purchased the outstanding $49 "V Control Pro" app for my iPad, but found that the iPad doesn't have quite enough screen real estate for larger plugins (Diva, for example). I figured, what the hell: despite the somewhat lukewarm reviews out there for touchscreens with Sonar (or any other DAW, for that matter), I'd try one out with Sonar X3. Mind you, I'm on Windows 7, so I can't even take advantage of multitouch yet...it's single touch all the way for me until I get the courage to plunge into Windows 8.
That said...WOW!! I tilted the touchscreen back at about a 70 degree angle so that I could still see my existing, conventional screen behind it, and with the resolution set a bit below 1920x1080 (so everything appears slightly larger), I've got all of my plugins (including the massive Diva window LOL) available on what is effectively a giant 23" iPad touch controller. On most plugins the onscreen controls are plenty big--this is particularly true for softsynths that aren't overly complex (think Diva, Monark, or other relatively vanilla subtractive synths--on Monark in particular the knobs are almost half-dollar size), plus just about any effects plugin (most compressors, for example, are just a few knobs). Parametric EQs, while a smidge more finicky, are a blast because you're just dragging the curves.
The bottom line is that it's made my plugins a blast to use--the touchscreen is almost like a dedicated controller that instantly morphs/customizes itself perfectly to whatever is on screen. There's an immediacy to everything that was lacking previously. It's enough of a qualitative difference that I'm finding myself using more of my plugins, more often (especially ones with clunky or daunting interfaces). As to why the process feels so much better, I think it's a couple of things: 1) you can set the screen resolution so that the plugins fill most of the screen, and 2) there are actually far fewer physical steps with a touchscreen than with mouse clicks. If you think about it, using a mouse to adjust a plugin parameter arguably takes five discrete steps: look down and move your hand to the mouse, look up and move the pointer to the plugin control, press the mouse button, drag the control, and release the button. By comparison, a drag of a finger along a touchscreen feels like only one or two steps.
Granted, some aspects work better than others--I'm not tossing my mouse entirely, mind you, and (like the Sonar folks) I kind of feel that touchscreens should augment, rather than 100% replace, one's existing controllers. Some examples:
- I'm not yet aware of a way to simulate right-clicking, which is important on some plugins (for example, Massive with its very tiny boxes for modulation routings--those still require a mouse right-click).
- "tapping" very tiny on/off controls or selection buttons takes some getting used to (at least on my particular touchscreen with Windows 7 single touch...maybe it's different on Windows 8). I've gotten much better at it once I figured out that it's more of a very slight "swipe" motion rather than a tap (again, this is single touch we're talking about here).
- There are some actions that still make more sense with a mouse. If you're scrolling through softsynth presets, repeatedly clicking in the same spot (or pressing the same button on a MIDI controller for patch changes) is easier.
- In single touch, some touch actions are seemingly unavailable at random, presumably depending on the app you're trying to control. For example, in Sonar (and again this is *single* touch...I'm guessing this is resolved in multitouch, based on YouTube vids of Sonar in Windows 8), I can't drag individual faders in the Console view or the Inspector window...instead I end up dragging the entire console or inspector up and down.
But the bottom line here is that (for me anyway) my workflow already feels vastly improved...and this is under less than ideal circumstances (still getting used to it, Win7 and single touch rather than multitouch) on a technology that's still evolving. Based on even just this initial experience, it's hard to imagine touchscreens not having a pretty massive impact on how people work in the future. I could see touchscreens wreaking complete havoc with the controller market (particularly the low- and mid-range options, i.e. everything below the Icon/Nuage range). I say this as a PT HD owner with a large and oft-used Control 24 sitting in my project studio. I love tactile knobs and faders as much as anyone, but you can already get 23" touchscreens for a relatively disposable $200 (Hannsprees are at this price), and they have the advantage of being essentially instantly and perfectly customizable to whatever is on screen. Plus they're "vendor-agnostic" (unlike my Control 24, which will soon be a doorstop once I have to move off of PT HD 10). I could easily see myself in a couple of years without any major hardware controller, but rather with 2 (or more!) oversized touchscreens, one for my Console view and one just for plugins.
Anyway, I just thought I'd post in detail about this, since when I researched the topic on Google it seemed like there was more speculation and opinion than people actually trying touchscreens and reporting back on what did and didn't work. I also suspect that some of the online reports are a bit dated--for example, I'm using X3 and a $300 23" touchscreen, neither of which may have been available a year ago (or six months ago, for that matter).
Would love to hear from others working in Sonar with touchscreens--in particular with Win8/multitouch, since that's my next move :)