With an updated SDK expected in a few weeks, Microsoft’s Kinect will incorporate hand-gesture recognition, allowing for more sophisticated interactions like pinch-to-zoom. We’ve seen this functionality from the much-hyped Leap Motion, but Kinect is far and away the leader in bringing gesture controls to the mass market.
In the past few years we’ve seen interfaces become more natural, incorporating gesture and speech controls, and Samsung’s latest phone will continue that trend. The Galaxy S IV is expected to leverage eye-tracking technology to control page scrolling based on where users are looking. We’ve seen this technology in action from companies like Tobii and Gazehawk, but Samsung’s adoption brings it to the consumer market.
When this video of Leap Motion’s motion-sensing technology came out, there was plenty of buzz from the tech community, and rightly so. Based on the demo, Leap Motion appeared to be a game changer for gesture controls, enabling a level of sophistication previously reserved for the keyboard. But how would it perform in real-world settings? What are some of the applications? How will it affect UI? While many of these questions remain unanswered (Leap Motion is still only accepting preorders), we now know the company is aiming for mainstream adoption thanks to a recent partnership with Best Buy. According to TechCrunch, the startup will sell its Leap controller on its own site and at Best Buy, in addition to shipping it with new Asus PCs in 2013.
The Lab has utilized eye tracking to measure attention, but startup PredictGaze is taking that a step further by using the technology as a way to interface with TVs, iPads and more. PredictGaze’s software leverages everyday webcams to let users control various devices through their eye movements alone. While we haven’t seen the technology in the wild, the demo looks pretty impressive.