Google's Project Soli: Controlling Gadgets With Hand Gestures

In today's highly tech-savvy world, changes in technology happen in the blink of an eye. With new gadgets and technologies being created and updated at a fast pace, every digital and technology company is kept on its toes to make breakthroughs.

Project Soli: By Researcher Ivan Poupyrev

Google, a leading innovator in this field, has solidified its stance with yet another redefining idea: Project Soli. Researcher Ivan Poupyrev and his team at Google have come up with a game-changing concept. Users can now forget about swiping and tapping on gadget screens. Google has developed a unique 'gesture technology' that works even on the smallest gadget screens. Project Soli can identify finger movements with the help of built-in radar microchips.

Radar-Based Motion Sensors

The search engine giant was granted approval by the Federal Communications Commission on Monday, December 31, 2018, for radar-based motion sensors that would operate at higher power levels. Although the request was initially opposed by Facebook on the grounds that it might cause interference with existing technologies, both parties eventually reached a consensus. The approval was granted in the public interest, since the power level agreed upon after the compromise cannot cause any major harm to other devices or people.


How the Radio Frequency Spectrum Is Used To Track Micro-Motions

The core of this technology, as explained by its mastermind, the Russian scientist Ivan Poupyrev, is the 'usage of radio frequency spectrums to track micro motions of the human hand.' While radars have long been used to track objects and satellites, they are now being put to use tracking human gestures, and the radar readings are used with wearables and gadgets. The technology uses a 'gesture recognition pipeline' that takes hand gestures from the receiver and processes them into hand signals. These signals are based on 'human intent' gestures that people naturally use during physical activity, such as turning a volume dial up or down, pressing a button between the index finger and thumb, or setting the time on an analogue watch.
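To make the idea of a gesture recognition pipeline concrete, here is a minimal, hypothetical sketch in Python: radar readings go in, a named intent gesture comes out. The real Soli chip uses millimetre-wave radar and learned models; the `RadarFrame` structure, feature names, thresholds, and gesture labels below are purely illustrative assumptions, not Google's actual API.

```python
# Hypothetical sketch of a radar "gesture recognition pipeline":
# raw frame -> summary features -> intent gesture label.
from dataclasses import dataclass
from statistics import mean


@dataclass
class RadarFrame:
    amplitudes: list[float]  # reflected signal strength per sample
    doppler: float           # net frequency shift (proxy for hand motion)


def extract_features(frame: RadarFrame) -> dict:
    """Reduce a raw frame to a few summary features."""
    return {"energy": mean(frame.amplitudes), "doppler": frame.doppler}


def classify(features: dict) -> str:
    """Map features to an intent gesture using toy thresholds."""
    if features["energy"] > 0.5 and abs(features["doppler"]) < 0.1:
        return "button_press"  # e.g. thumb tapping the index finger
    if features["doppler"] > 0.1:
        return "dial_up"       # e.g. rubbing fingers in one direction
    if features["doppler"] < -0.1:
        return "dial_down"     # ...or in the other direction
    return "no_gesture"


def pipeline(frame: RadarFrame) -> str:
    return classify(extract_features(frame))


print(pipeline(RadarFrame(amplitudes=[0.7, 0.8, 0.9], doppler=0.02)))  # button_press
print(pipeline(RadarFrame(amplitudes=[0.2, 0.3], doppler=0.4)))        # dial_up
```

The key design idea the video describes survives even in this toy version: the gestures are defined by human intent (press, turn a dial), not by where a finger lands on a screen.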

Welcome To Project Soli (Video Source: Google ATAP)

With much talk and buzz surrounding this project, competitors, stakeholders, and tech-savvy customers are waiting for more news from Google. The official launch is yet to happen; all we have seen so far are news reports and a video released by Google explaining Project Soli.

About Meghna

Meghna Gopal is an undergraduate student pursuing a degree in Bachelor of Business Administration (Honors) from CHRIST (Deemed to be University), Bangalore. Meghna has a passion for writing and reading and loves to express her views about topics and issues. She is the President of the Debating Society of her college and the Chief Content Associate of her college magazine. She has worked with several companies as a content writer. "Do what you love, Love what you do!" is the quote she strongly abides by. If you want to get in touch with Meghna, please email: [email protected]

