This new Google venture is dubbed Project "Soli" (no idea why it's called that... no one seems to have questioned it... just go with it... Google are geniuses).
The concept behind the project is to turn the user's hand and fingers into an interface: through certain hand gestures, users can interact with various wearable and internet-enabled devices.
Although science fiction has been imagining this kind of virtual interaction for years, this may be the first real attempt to bring the concept into the physical world. The magic lies in radar signals.
Google have recognized that touch-based interfaces (which include Android Wear and most other wearables) require a lot of display space to ensure that users with relatively over-sized fingertips (i.e. the fat-fingered among us) can reliably hit the touch target.
Their solution to this problem: let the fingers do the work themselves! If Project Soli becomes productized, we'll all soon be interacting with our devices and applications wirelessly and intuitively, like some kind of futuristic wizard race.
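To make that a bit more concrete: the demo shows radar-recognized micro-gestures like a virtual button press and a virtual dial turn being mapped to device actions. Soli's API isn't public, so everything below is a purely hypothetical sketch — the gesture names, the `SmartWatch` class, and the `dispatch` function are all made up for illustration.

```python
# Hypothetical sketch only: Soli's actual API is not public.
# This imagines how recognized radar micro-gestures might be
# dispatched to actions on a wearable device.

class SmartWatch:
    """Toy device that a radar gesture sensor might control."""

    def __init__(self):
        self.powered = False
        self.volume = 0

    def toggle_power(self):
        self.powered = not self.powered

    def adjust_volume(self, delta):
        # Clamp volume to a 0-10 range.
        self.volume = max(0, min(10, self.volume + delta))


def dispatch(device, gesture):
    """Map a recognized gesture name to an action on the device."""
    actions = {
        "button_press": lambda: device.toggle_power(),     # finger tap in mid-air
        "dial_turn_cw": lambda: device.adjust_volume(+1),  # rub thumb against finger
        "dial_turn_ccw": lambda: device.adjust_volume(-1),
    }
    action = actions.get(gesture)
    if action is not None:
        action()
    return device


if __name__ == "__main__":
    watch = SmartWatch()
    dispatch(watch, "button_press")
    dispatch(watch, "dial_turn_cw")
    print(watch.powered, watch.volume)
```

The interesting (and hard) part, of course, is the gesture recognition itself — turning raw radar returns into labels like `"dial_turn_cw"` — which is exactly the problem Project Soli is tackling.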
Here's the video, for the non-believers:
P.S. If anyone can tell me why it's called "Project Soli", send me a letter via Project Wing.