Rise of gestures and touch-free input
For years, smartphones have included some sort of indirect input, like silencing the ringer when you flip the phone over or dialing by voice. Taken in a wider context, gestures and voice are part of a broader world of multimodal input: basically, anything that isn't your finger tapping buttons on the screen.
Many smartphones already include some sort of optical or physical gesture control for a handful of tasks.
(Credit: Sarah Tew/CNET)
Gestures and voice may be starting the trend, but other, more sophisticated input methods will soon move from wacky options to normal ways of interacting with devices. For instance, calling up the Galaxy S4's S Voice Drive app already assumes you'll speak instructions rather than type them. In a drawing program, your finger or a stylus may be the most natural tool for the job.
What if launching a game automatically activated eye-tracking sensors for enhanced play? And what if tapping the phone against an NFC receiver in your car turned on motion control, letting you mute or raise the volume of a phone call or the radio with a wave of your hand?
Then there are the apps and tasks that will seamlessly switch among touch, voice, text, and gesture input depending on the app and what you're doing. Let's say you launch a fitness app by tapping. When it senses motion, it switches to voice commands. At the end of the exercise, when you're weak and uncoordinated, you could wave your hand above the screen to drill down into your stats.
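To make that idea concrete, here is a minimal sketch of how an app might pick an input mode from sensed context. It's an illustration only: the class, mode names, and context flags are hypothetical and don't come from any shipping fitness app or SDK.

```kotlin
// Hypothetical sketch: choosing an input mode from sensed context.
// None of these names come from a real SDK; they only illustrate the idea
// of an app switching between touch, voice, and hover/gesture input.

enum class InputMode { TOUCH, VOICE, HOVER_GESTURE }

data class SensedContext(
    val isMoving: Boolean,        // e.g., derived from the accelerometer
    val workoutFinished: Boolean  // e.g., the user just ended the session
)

fun pickInputMode(context: SensedContext): InputMode = when {
    context.workoutFinished -> InputMode.HOVER_GESTURE // wave over the screen for stats
    context.isMoving        -> InputMode.VOICE         // hands are busy, ears are free
    else                    -> InputMode.TOUCH         // default: tap to launch
}

fun main() {
    println(pickInputMode(SensedContext(isMoving = true, workoutFinished = false)))  // VOICE
    println(pickInputMode(SensedContext(isMoving = false, workoutFinished = true)))  // HOVER_GESTURE
}
```

The point of the sketch is simply that the switching logic lives in one place and the rest of the app asks "which mode am I in right now?" rather than hard-coding a single input method.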
In Leap Motion's computer demo, the controller is you.
(Credit: Screenshot by Jessica Dolcourt/CNET)
What if, to authorize a payment, you've set up your phone so that you bump it against a surface while simultaneously giving a voice command?
In addition to gestures we already know, like bumping and waving, companies are hard at work adding tracing and body movement to the mix. There's Microsoft's Kinect, of course, the motion-sensing add-on that turns your whole body into a game controller. In a more limited vein, Samsung has extended the software it uses to recognize a hovering stylus to the human finger: on devices like the Galaxy S4, you can now hold a fingertip just above the screen to preview a browser tab, photo, or video.
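For developers, standard Android already exposes hover events on views, which is roughly the kind of plumbing a finger- or stylus-hover feature relies on (whether Samsung's own implementation uses exactly this path is an assumption). Here's a minimal sketch, assuming a custom view on a device whose touchscreen actually reports hover:

```kotlin
import android.content.Context
import android.view.MotionEvent
import android.view.View

// Minimal sketch: reacting to a finger or stylus hovering above a view.
// Assumes the device's touchscreen reports hover, which standard Android
// delivers as MotionEvent.ACTION_HOVER_* events to onHoverEvent().
class PreviewCardView(context: Context) : View(context) {

    override fun onHoverEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_HOVER_ENTER -> showPreview()                     // pointer entered the hover zone
            MotionEvent.ACTION_HOVER_MOVE  -> trackPointer(event.x, event.y)    // follow it around
            MotionEvent.ACTION_HOVER_EXIT  -> hidePreview()                     // pointer moved away
        }
        return true
    }

    // Hypothetical helpers: what an app might do with the hover state.
    private fun showPreview() { /* e.g., pop up a thumbnail of the tab or photo */ }
    private fun trackPointer(x: Float, y: Float) { /* move the preview with the finger */ }
    private fun hidePreview() { /* dismiss the preview */ }
}
```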
Leap Motion is another company working with gestures in this way. Its standalone sensor, imagined for use with a large-screen monitor or TV, responds when you pinch your fingers to zoom, or point, draw, and trace in the air. This is the kind of sensor that could easily find its way into a smartphone or tablet. (Click the link above to watch the YouTube video demo; the Leap box comes out next month.)