I don't think we've reached the point where touchscreen technology is optimised, either in software or hardware. On the hardware front, touch is simply not good for some operations. Volume faders are a good example, where the tactile feedback and resistance is part of the engineering experience. And knobs. It is completely natural to twist a knob. But how do you twist a virtual knob? With most mouse-driven knobs you drag up and down. So the software could replace all the knobs with sliders. Or Cake (or some other enterprising firm) could piece together a touchscreen w/ physical knobs to the side (or, less ergonomically, at the bottom/top). Touch the screen knob and use the physical knob to adjust the parameter. The same w/ a volume fader. A real motorized fader - touch the channel fader on screen, the physical fader gets focus, and fade away ...
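Just to make that touch-to-focus idea concrete, here's a rough sketch in Python. Every name in it (Parameter, ControlSurface, and so on) is made up for illustration - it's not any real DAW or control-surface API, just the routing logic: tap an on-screen control to give it focus, then the one physical encoder or fader edits whatever currently has focus.

```python
# Rough sketch of "touch to focus, physical control to adjust".
# All classes and names here are hypothetical, not a real driver or DAW API.

class Parameter:
    def __init__(self, name, value=0.0, lo=0.0, hi=1.0):
        self.name, self.value, self.lo, self.hi = name, value, lo, hi

    def nudge(self, delta):
        # Clamp so the physical control can't push past the parameter's range.
        self.value = max(self.lo, min(self.hi, self.value + delta))


class ControlSurface:
    """One physical knob/fader that follows whatever was last touched on screen."""

    def __init__(self):
        self.focused = None

    def touch(self, param):
        # Touching the on-screen knob or fader just moves focus here.
        self.focused = param
        print(f"focus -> {param.name} ({param.value:.2f})")

    def turn(self, delta):
        # The physical encoder edits only the focused parameter.
        if self.focused:
            self.focused.nudge(delta)
            print(f"{self.focused.name} = {self.focused.value:.2f}")


surface = ControlSurface()
vol = Parameter("Ch 3 volume", 0.75)
cutoff = Parameter("Synth cutoff", 0.40)

surface.touch(vol)      # tap the channel fader on screen
surface.turn(-0.10)     # ride the physical fader down
surface.touch(cutoff)   # tap a synth knob
surface.turn(+0.05)     # twist the physical knob
```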
That is step one. But software integration - writing the software for touch - is even more important. There are a lot of jobs that would be easier with touch than with a mouse. Scrolling - which is quicker, grabbing a small button or arrow off to the side, or simply swiping up or down and touching to stop? Which is easier for expanding the view - pinching, or grabbing another small bar and sliding it right or left? I'll be interested in seeing how well X2a and Win 8 handle navigating a project in all directions and changing sizes, and once you focus on something - like an EQ or synth - how well it opens that up. Touching to toggle and resizing your view ought to be easy, but how easy editing will be is another question.
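Same deal with the navigation gestures - here's a hedged sketch of how swipe-to-scroll and pinch-to-zoom might map onto a project view. The ProjectView class and the numbers are invented for illustration; I'm not quoting any real DAW or Windows touch API.

```python
# Hedged sketch: mapping touch gestures to project-view navigation.
# Gesture handlers and units are made up for illustration only.

class ProjectView:
    def __init__(self):
        self.scroll_x = 0.0        # timeline position, in bars
        self.pixels_per_bar = 20.0  # horizontal zoom level

    def swipe(self, dx_pixels):
        # Swiping pans the timeline; a leftward flick (negative dx) moves later in the song.
        self.scroll_x = max(0.0, self.scroll_x - dx_pixels / self.pixels_per_bar)

    def pinch(self, scale):
        # Pinch out (scale > 1) zooms in, pinch in zooms out, within sane limits.
        self.pixels_per_bar = min(200.0, max(2.0, self.pixels_per_bar * scale))


view = ProjectView()
view.swipe(-400)   # flick left: jump ahead about 20 bars
view.pinch(2.0)    # pinch out: double the horizontal zoom
view.swipe(150)    # small swipe right: back up a few bars
print(f"bar {view.scroll_x:.1f}, {view.pixels_per_bar:.0f} px/bar")
```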
I don't much like paying for Apple products, and Steve Jobs is dead, but from what I've heard from people he was a real as ... tough about making the designers design for dead-easy manipulation. So easy even a bass player could understand and adapt to the product.
Touchscreens are coming, but they will be a supplement to the things we already know how to do well. They should help speed up engineering where they can, and leave knobs and volume faders to do the things they do best.