Microsoft – Gesture Control

Jun 28, 2016

Microsoft Pushing Gesture Control

Almost every object you encounter day-to-day has been designed to work with the human hand, so it's no wonder so much research is being conducted into tracking hand gestures to create more intuitive computer interfaces. Examples include Purdue University's DeepHand and the consumer product Leap Motion. Now Microsoft has outlined several projects of its own that deal with hand tracking, haptic feedback and gesture control.

“How do we interact with things in the real world?” asks Jamie Shotton, a Microsoft researcher in the labs at Cambridge, UK. “Well, we pick them up, we touch them with our fingers, we manipulate them. We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them.”

The researchers believe that gesture tracking is the next big thing in how humans interact with computers and smart devices. Combining gestures with voice commands and traditional physical inputs, such as touchscreens and keyboards, could allow ambient computing systems, like Internet of Things devices, to better anticipate and respond to our needs.

The Challenges

The first hurdle is a big one: the human hand is extremely complex, and tracking all the possible configurations it can form is a massive undertaking. That's the focus of Handpose, a research project underway at Microsoft's Cambridge lab. Using the same Kinect sensor you'd find packaged with an Xbox console, Handpose tracks a user's hand movements in real time and displays virtual hands that mimic everything the real ones do.

The tool is precise enough to let users operate digital switches and dials with the dexterity you'd expect of physical hands, and it can run on a consumer device such as a tablet.
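To make the idea concrete, here is a minimal sketch of what such a tracking loop might look like. The camera reader, pose estimator and renderer are hypothetical stand-ins for illustration; this is not Microsoft's code or the Kinect SDK.

```python
# A hypothetical sketch of a Handpose-style loop: read a depth frame,
# estimate hand joint positions, and hand them to a renderer.
# The camera and estimator here are stand-ins, not Microsoft's actual code.

import random

def read_depth_frame():
    # Stand-in for a Kinect depth frame: a flat list of depth values (mm).
    return [random.randint(500, 1500) for _ in range(64 * 64)]

def estimate_hand_pose(frame):
    # Stand-in for the learned pose model: returns 3D positions for a few
    # named joints. A real system infers a full kinematic hand model with
    # dozens of degrees of freedom from each frame.
    avg_depth = sum(frame) / len(frame)
    return {"wrist": (0.0, 0.0, avg_depth), "thumb_tip": (0.05, 0.02, avg_depth - 30)}

def render_virtual_hand(joints):
    # Stand-in for the graphics layer that mirrors the user's hand.
    print(f"wrist at z={joints['wrist'][2]:.0f}mm")

for _ in range(3):  # in a real system this runs once per camera frame
    render_virtual_hand(estimate_hand_pose(read_depth_frame()))
```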

“We’re getting to the point that the accuracy is such that the user can start to feel like the avatar hand is their real hand,” says Shotton.

Another key part of the sensation that digital hands are really your own comes through the sense of touch. Users of Handpose's virtual switches and dials reported feeling immersed even without any haptic feedback, but a Microsoft team in Redmond, Washington, is experimenting with something more hands-on.

Microsoft’s Solution

First, the system recognizes a physical button, even one not wired to anything in the real world, and knows it has been pushed by reading the movement of the hand. A retargeting system then lays multiple, context-sensitive virtual commands over that single physical control.

This means a limited set of physical objects on a small real-world panel is enough to interact with a complex wall of virtual knobs and sliders, like an aeroplane cockpit, for example. The dumb physical buttons and dials help make virtual interfaces feel more real, the researchers report.
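As a rough illustration of the retargeting idea, here is a hypothetical sketch in Python. The panel layout, control names and gaze-based selection are assumptions for illustration, not Microsoft's implementation.

```python
# A hypothetical sketch of context-sensitive retargeting: one physical button
# is remapped to whichever virtual control the user is currently reaching for.

VIRTUAL_PANEL = {
    "radio_volume": (0.2, 0.8),   # virtual control -> position on the wall
    "landing_gear": (0.5, 0.3),
    "cabin_lights": (0.8, 0.6),
}

def nearest_virtual_control(gaze_point):
    # Pick the virtual control closest to where the user is looking/reaching;
    # the physical button press is then "retargeted" onto that control.
    def dist_sq(pos):
        return (pos[0] - gaze_point[0]) ** 2 + (pos[1] - gaze_point[1]) ** 2
    return min(VIRTUAL_PANEL, key=lambda name: dist_sq(VIRTUAL_PANEL[name]))

def on_physical_button_press(gaze_point):
    target = nearest_virtual_control(gaze_point)
    print(f"Physical press retargeted to virtual control: {target}")

on_physical_button_press((0.45, 0.35))  # -> landing_gear
```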

The third project comes out of Microsoft’s Advanced Technologies Lab in Israel. The research on Project Prague aims to enable software developers to incorporate hand gestures for various functions in their apps and programs. So, miming the turn of a key could lock a computer, or pretending to hang up a phone might end a Skype call.
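Conceptually, a developer-facing gesture API might look something like the sketch below. The decorator, gesture names and dispatch function are illustrative assumptions, not the actual Project Prague SDK.

```python
# A hypothetical sketch of gesture-to-action bindings: apps register
# callbacks for named gestures, and the recognizer dispatches to them.

gesture_handlers = {}

def on_gesture(name):
    # Decorator that binds an app function to a recognized gesture.
    def register(func):
        gesture_handlers[name] = func
        return func
    return register

@on_gesture("turn_key")
def lock_computer():
    print("Locking workstation")

@on_gesture("hang_up")
def end_call():
    print("Ending Skype call")

def dispatch(recognized_gesture):
    # Called by the recognizer whenever a gesture is detected.
    handler = gesture_handlers.get(recognized_gesture)
    if handler:
        handler()

dispatch("turn_key")  # -> Locking workstation
```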

The researchers built the system by feeding millions of hand poses into a machine-learning algorithm, training it to recognize specific gestures. The system scans the hands using a consumer-level 3D camera and relies on hundreds of small artificial-intelligence units to build a complete picture of a user's hand positions, and of their intent.
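As a loose illustration of that training idea, here is a minimal sketch that fits a classifier to synthetic hand-pose data. The feature layout, labels and the choice of a random forest are assumptions for illustration, not Microsoft's actual pipeline.

```python
# A hypothetical sketch of training a gesture classifier from labeled hand
# poses. The data here is synthetic; a real system would use millions of
# captured poses.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Each pose: flattened 3D positions of 21 hand joints (21 * 3 = 63 features).
n_samples, n_features = 1000, 63
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 3, size=n_samples)  # 0=turn_key, 1=hang_up, 2=none

# An ensemble of many small trees loosely echoes the article's description
# of hundreds of small AI units combining into one decision.
clf = RandomForestClassifier(n_estimators=200, max_depth=5, random_state=0)
clf.fit(X, y)

new_pose = rng.normal(size=(1, n_features))
print(["turn_key", "hang_up", "none"][clf.predict(new_pose)[0]])
```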

In addition to gaming and virtual reality, the team believes gesture control has applications for everyday work tasks, including browsing the web and creating and giving presentations.

Credit: Gizmag / Microsoft Blog
