Qualcomm has designed a touch sensor for holographic phones

One of the best measures of a trend’s progress in the consumer electronics world is the reaction of the component suppliers. If the suppliers start taking concrete steps towards addressing it, that means they anticipate demand from their manufacturing clients, who in turn must be anticipating demand from consumers. That’s why it’s such an encouraging sign that mobile holography is apparently already starting to receive attention at Qualcomm.

The communications giant is seeking patent protection for a gesture recognition mechanism that tracks the motions your hand makes in front of a smartphone or tablet’s display when, say, interacting with a projected menu icon. It does so using an array of miniature cameras spread along the periphery of the screen. Their small size limits the lenses to fairly low-quality black-and-white capture, but that’s more than sufficient for Qualcomm’s purposes.

The company is not the first to try to provide visual gesture recognition in a mobile form factor: Samsung took a shot at it more than two and a half years ago with the Galaxy S4. But two generations later, the South Korean giant’s flagship smartphone still limits users to a handful of basic motions that are detected only if performed a few centimeters from the screen. The reason is that the specialized depth-sensing cameras normally used for hand tracking simply don’t lend themselves to mobile devices.

Even the most accessible time-of-flight variants carry a steep price tag that is exacerbated by equally uneconomical power and processing requirements. The industry might still have found the will to tackle those issues were it not for the fact that the technology is usually unable to accurately capture anything closer than 15 centimeters, an unacceptable limitation for mobile users. Qualcomm’s newly revealed gesture recognition system bridges that gap and then some.

The system compensates for its low-tech cameras’ lack of depth sensing with a homegrown algorithm that decomposes images of the user’s hands into geometric abstractions, from which depth information is then extracted using a statistical technique known as non-linear regression. The results are fed into a second model, trained to recognize about four million different gesture variations, that converts the data back into a form the mobile device can use to identify specific commands.
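To make the two-stage idea concrete, here is a minimal, purely illustrative sketch in Python. It assumes a toy geometric feature (the apparent pixel width of the hand silhouette), a toy non-linear depth model, and a trivial direction-based classifier standing in for the gesture model; none of the names, formulas, or thresholds come from Qualcomm’s actual system.

```python
# Illustrative two-stage pipeline: (1) recover depth from a simple
# geometric feature via a non-linear model, (2) map recovered motion
# to a coarse command. Everything here is a hypothetical stand-in.

def estimate_depth_cm(apparent_width_px, calibration=500.0):
    """Toy non-linear regression: apparent size of an object falls off
    roughly as 1/depth, so depth is modelled as calibration / width.
    The calibration constant is an assumed, made-up value."""
    return calibration / apparent_width_px

def classify_swipe(x_positions, threshold_px=10.0):
    """Stand-in for the gesture model: reduce a finger's horizontal
    trajectory across frames to a coarse command by looking only at
    the net direction of motion, as basic swipes would allow."""
    net = x_positions[-1] - x_positions[0]
    if net > threshold_px:
        return "swipe_right"
    if net < -threshold_px:
        return "swipe_left"
    return "none"

# A hand silhouette 50 px wide would sit ~10 cm from the screen
depth = estimate_depth_cm(50.0)

# Finger x-positions over a few frames, drifting right -> a right swipe
gesture = classify_swipe([20.0, 45.0, 80.0])
```

The point of the split is that only the first stage needs fine-grained geometry; a coarse command like a swipe survives a much noisier depth estimate, which is consistent with the longer range the article reports for such gestures.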

The system is thus able to track individual fingers at distances of up to 15 centimeters, a range that doubles for more basic commands like swipes, which only require determining the general direction in which the user’s hand is moving. That should make the technology equally useful for interacting with close-range holograms, like the kind LG is working to implement in its future phones, and with everyday apps. No more blocking the screen with your hand when turning a page on Flipboard.

Image via StockSnap