For those wanting to take the Harry Potter approach to controlling and interacting with your mobile device, there are some welcome developments on the horizon. Already there is the Amazon Fire Phone, which, although it hasn’t exactly set the … ahem … world on fire, does incorporate gesture control via its four forward-facing cameras. However, there is also Elliptic Labs, which seeks to make mobile devices gesture-capable via ultrasound.
Elliptic’s Touchless Gesture Recognition Engine is already implemented in a Windows 8 suite and an Android SDK. According to the company’s materials, “ultrasound signals sent through the air from speakers integrated in smart phones and tablets bounce against your hand and are recorded by microphones also integrated in these devices. In this way, Elliptic Labs’ technology recognizes your hand gestures and uses them to move objects on a screen, very similar to how bats use echolocation to navigate.” The system can capture gestures over a 180-degree field of view, with sensitivity to depth as well as position, giving it some advantages over camera- or IR-based solutions, as well as the (not obviously useful) ability to work in the dark. Implementation by OEMs is now expected to follow in 2015.
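The echolocation principle the company describes boils down to simple physics: a pulse travels from the speaker to your hand and back, so the round-trip echo delay encodes distance. The sketch below illustrates that calculation only; it is not Elliptic Labs’ implementation, and the function name and numbers are ours.

```python
# Illustrative sketch of the echolocation principle (not Elliptic Labs' code):
# convert the round-trip delay of an ultrasound echo into a distance.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C


def echo_distance(delay_s: float) -> float:
    """Distance to a reflector, given the round-trip echo delay in seconds.

    The pulse travels out to the hand and back, so the one-way
    distance is half the total path.
    """
    return SPEED_OF_SOUND * delay_s / 2.0


# A hand about 10 cm from the device returns an echo after ~0.58 ms:
print(f"{echo_distance(0.000583) * 100:.1f} cm")  # prints "10.0 cm"
```

In practice, depth is only half the story: comparing arrival times and amplitudes across the device’s multiple microphones is what lets such a system also localize the hand’s position across that wide field of view.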
So soon, you may be turning the page on your mobile reading device with a swish and flick as a matter of course.