• SONAR
  • what comes after touch
2013/02/07 07:49:11
gswitz
http://www.pcworld.com/article/2027393/windows-9-how-microsoft-might-overhaul-the-interface-in-its-next-os.html
 
So, I doubt any of you will care, but I think this is cool and worth sharing for conceptual purposes.
 
Remember the Kinect sensor, designed for the Xbox and mass-marketed, which made robotics sensors cheaper? Well, what if it were built into your monitor and you could interact with your applications by moving your hands in 3D?
 
Watch the little video on the link above.
 
I think this is a very interesting concept, for robotics and for computing in general.
 
The Kinect sensor has been used on robotic helicopters to make it possible for the helicopter to navigate a room with moving people in it.
 
Enabling touch gets the OS ready for multi-point input, and SONAR is taking advantage of this.
 
Anyway... not sure what the developers might be thinking here, but it might not be a great leap from touch to ... what? ... not-touch?
 
G
2013/02/07 09:45:00
SuperG
I think there's opportunity for various sensors to be used in musical performance, but a lot depends on how you think these sensors should be used. For editing purposes, as in a DAW, there are pluses and minuses.

For performance purposes, I suppose you could use the additional sensors now common in tablets, such as orientation and acceleration, as instrument inputs, but these would be mostly analog in nature rather than discrete - sorta like the difference between a theremin and a keyboard.

As for editing, touch is it for now. Kinect is really cool, but it really doesn't have enough fine resolution for absolute control. Gross gestures work, but it can't tell whether you've raised your left pinky or not, not yet. Someday, though, I imagine we'll see *real* air-guitar and air-keyboard synth performances. No more simply pretending to be a cool super rockstar - you're gonna have to actually read music and practice! 
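To make the analog-versus-discrete point concrete, here's a minimal illustrative sketch (my own, not anything from SONAR or Kinect): mapping a hypothetical tablet accelerometer tilt reading to a continuous, theremin-like pitch, rather than to discrete keyboard notes. The function name and the frequency range are assumptions for illustration only.

```python
import math

def tilt_to_pitch_hz(accel_x, accel_z, low_hz=110.0, high_hz=880.0):
    """Map device tilt (accelerometer x/z readings, in g) to a
    continuous frequency between low_hz and high_hz.

    Hypothetical example: real sensor APIs and axis conventions vary
    by platform.
    """
    # Tilt angle around one axis, in the range -pi..pi.
    angle = math.atan2(accel_x, accel_z)
    # Normalize to 0..1 so the full tilt range spans the pitch range.
    t = (angle / math.pi + 1.0) / 2.0
    # Exponential mapping so equal tilt steps correspond to equal
    # musical intervals. Note the output is continuous -- analog like
    # a theremin, not stepped like a keyboard.
    return low_hz * (high_hz / low_hz) ** t

# Device flat on a table (x = 0 g, z = 1 g) sits mid-range:
print(round(tilt_to_pitch_hz(0.0, 1.0), 1))
```

Quantizing `t` to twelve steps per octave before the exponential mapping would turn the same sensor into a discrete, keyboard-like controller, which is exactly the theremin/keyboard distinction above.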