Google has unveiled an interaction sensor that uses radar to translate subtle hand movements into gesture controls for electronic devices, with the potential to transform the way they’re designed (+ movie).
Project Soli was revealed by one of the developers in Google’s Advanced Technology and Projects (ATAP) group during the company’s I/O developer conference in San Francisco.
The team has designed a tiny sensor that fits onto a chip. Using radar, the sensor can track sub-millimetre hand gestures at high speed and accuracy, and use them to control electronic devices without physical contact. This could remove the need to design knobs and buttons into the surfaces of products such as watches, phones, radios and even medical equipment.
“Capturing the possibilities of the human hand was one of my passions,” said Project Soli founder Ivan Poupyrev. The chip emits waves in the radio-frequency spectrum at a target. A receiving panel then picks up the reflected waves, which are passed to a computer circuit that interprets the differences between them.
Even subtle changes in the returning waves can be translated into commands for an electronic device.
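To illustrate why radar can resolve such small movements, here is a minimal sketch of the underlying physics. It assumes a 60 GHz carrier (the band Soli has been reported to use; the exact figure is an assumption here, not from this article): at that frequency the wavelength is about 5 mm, so even a half-millimetre hand movement produces a large, easily measured phase shift in the reflected wave.

```python
import math

# Assumed carrier frequency for illustration; not confirmed in the article.
SPEED_OF_LIGHT = 3.0e8       # m/s
FREQUENCY = 60e9             # Hz
WAVELENGTH = SPEED_OF_LIGHT / FREQUENCY  # ~5 mm

def phase_shift_for_displacement(displacement_m: float) -> float:
    """Phase change (radians) of the reflected wave when the target moves
    radially by `displacement_m`. The factor of 2 in the path comes from
    the round trip (out to the hand and back), hence 4*pi/wavelength."""
    return (4 * math.pi * displacement_m) / WAVELENGTH

def displacement_from_phase(delta_phase: float) -> float:
    """Invert the relation: recover the displacement implied by a
    measured phase change."""
    return delta_phase * WAVELENGTH / (4 * math.pi)

# A 0.5 mm finger twitch yields a phase shift of ~1.26 radians,
# which is comfortably detectable.
shift = phase_shift_for_displacement(0.5e-3)
```

This is only the core relation; a real pipeline would track such phase changes over time across many reflected pulses to build a motion signal.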
“Radar has been used for many different things: tracking cars, big objects, satellites and planes,” said Poupyrev. “We are using it to track micro motions, twitches of the human hand, and then use it to interact with wearables, the Internet of Things and other computing devices.”
The team is able to extract information from the received data and identify the intent of the user by comparing the signals to a database of stored gestures. These include actions that mimic the use of volume knobs, sliders and buttons, creating a set of “virtual tools”.
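Conceptually, comparing an incoming signal against a database of stored gestures is a template-matching problem. The sketch below shows one simple way this could work, assuming each gesture is summarised as a fixed-length feature vector; the gesture names and feature values are invented for illustration, and ATAP's actual pipeline is far more sophisticated.

```python
# Hypothetical gesture templates: name -> feature vector.
# Values are illustrative only.
GESTURE_DATABASE = {
    "volume_knob_turn": [0.9, 0.1, 0.3],
    "slider_swipe":     [0.2, 0.8, 0.1],
    "button_tap":       [0.1, 0.2, 0.9],
}

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(signal_features):
    """Return the stored gesture whose template is closest to the input
    (a nearest-neighbour match against the database)."""
    return min(GESTURE_DATABASE,
               key=lambda name: distance(GESTURE_DATABASE[name],
                                         signal_features))
```

A measured signal close to the “tap” template would then resolve to that gesture, which the device can translate into a button press.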
“Our team is focused on taking radar hardware and turning it into a gesture sensor,” said Jaime Lien, lead research engineer on the project. “The reason we’re able to interpret so much from this one radar signal is because of the full gesture-recognition pipeline that we’ve built.”
Compared to cameras, radar has very high positional accuracy and so can sense tiny motions. It can also function through other materials, meaning the chips can be embedded within objects and still pick up the gestures.
The gestures chosen by the team were selected for their similarity to standard actions we perform every day. For instance, swiping a thumb across the side of a closed index finger could be used to move along a flat plane, such as a slider, while tapping a finger and thumb together would press a button.
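The mapping from recognised micro-gestures to the controls they mimic can be sketched as a simple dispatch table. The gesture names and actions below are assumptions for illustration, not Soli's actual interface.

```python
from typing import Callable, Dict

class VirtualTools:
    """Toy state for the 'virtual tools' a gesture might drive."""
    def __init__(self):
        self.slider = 0.0
        self.pressed = False

    def move_slider(self, delta: float):
        # Thumb swiping along the side of a closed index finger.
        self.slider += delta

    def press_button(self):
        # Finger and thumb tapped together.
        self.pressed = True

tools = VirtualTools()

# Hypothetical gesture names mapped to their virtual-tool actions.
HANDLERS: Dict[str, Callable[[], None]] = {
    "slider_swipe": lambda: tools.move_slider(0.1),
    "button_tap":   tools.press_button,
}

def dispatch(gesture: str) -> None:
    """Route a recognised gesture name to its action; ignore unknowns."""
    if gesture in HANDLERS:
        HANDLERS[gesture]()
```

The same table could grow to cover knob turns or any other virtual control a product designer wants to expose.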
Google’s ATAP department is already testing hardware applications for the technology, including controls for digital radios and smartwatches. The chips can be produced in large volumes and built into devices and objects.