Google: Project Soli,
Touchless Gesture Interaction Sensor

Working with AllofUs, we were asked by Google’s ATAP (Advanced Technology and Projects) team to help develop a new gesture-based device that utilises radar technology.

By making use of radar’s unique qualities, the sensor that became Project Soli can track sub-millimetre gestures at high speed and with striking accuracy. It also has the added benefit that it can be hidden under a range of different materials.
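
Soli’s published approach uses millimetre-wave FMCW radar, in which a moving hand shows up as patterns of range and velocity rather than as a camera image. As a rough, hypothetical illustration of that idea in Python (this is not ATAP’s pipeline; the frame sizes and target parameters below are invented for the example), the sketch derives a range-Doppler map from one synthetic radar frame using two FFTs:

```python
import numpy as np

def range_doppler_map(frame: np.ndarray) -> np.ndarray:
    """Turn one radar frame into a range-Doppler magnitude map.

    frame: complex baseband samples, shape (n_chirps, n_samples_per_chirp).
    Rows of the result index Doppler (velocity) bins; columns index range bins.
    """
    # FFT across fast time (within each chirp) resolves range...
    range_fft = np.fft.fft(frame, axis=1)
    # ...and FFT across slow time (chirp to chirp) resolves Doppler.
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)
    return np.abs(doppler_fft)

# Toy frame containing a single synthetic reflector at
# range bin 10, Doppler bin 4 (both values arbitrary).
n_chirps, n_samples = 32, 64
chirps = np.arange(n_chirps)[:, None]
samples = np.arange(n_samples)[None, :]
frame = np.exp(2j * np.pi * (10 * samples / n_samples + 4 * chirps / n_chirps))

rd = range_doppler_map(frame)
doppler_bin, range_bin = np.unravel_index(np.argmax(rd), rd.shape)
print(f"peak at Doppler bin {doppler_bin - n_chirps // 2}, range bin {range_bin}")
```

Gesture recognition would then operate on sequences of such maps; radar’s sub-millimetre sensitivity comes from the phase of the returned signal rather than from the coarse range bins shown here.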

Recognising the potential of this technology, the ATAP team approached AllofUs to work with them on exploring and identifying real-life use cases that would demonstrate its capabilities across a wide array of applications.

With our experience of and insight into how people interact with technology in both the physical and virtual worlds, we presented a series of speculative scenarios showing near-future Project Soli products in use, spanning travel, healthcare and leisure.

With the Project Soli hardware now built, the concepts developed by AllofUs and the ATAP team are being prototyped and brought to life.

I was responsible for the research and interactive concept development of this project.

The Soli Sensor

Prototype Collection

The Current Developers Kit

Gestural Input

Illustration of Soli’s radar technology

Interactive Principles

Through careful analysis of radar’s unique technological capabilities, as well as the contextual considerations such a technology demands, we identified four key use cases: Generic Input, Context Sensing, Reactive Environment, and Fun Experiences.

We then devised a set of interactive principles that would enable us to develop user-product scenarios within the framework of these use cases. The principles centred on a gestural language that could be integrated into, and tailored to, each specific context in which Project Soli was being used. Accordingly, they gave careful consideration to the embodied nature of the user, their environment, and their situation.
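
As a concrete, hypothetical sketch of that principle (the gesture names and contexts below are invented for illustration, not taken from the project), a single shared gestural vocabulary can be bound to different actions in each context:

```python
from dataclasses import dataclass
from typing import Callable, Dict

Gesture = str  # e.g. "micro_tap", "dial_turn", "swipe"

@dataclass
class Context:
    """One deployment context, binding the shared gestures to local actions."""
    name: str
    bindings: Dict[Gesture, Callable[[], None]]

    def handle(self, gesture: Gesture) -> None:
        action = self.bindings.get(gesture)
        if action is not None:
            action()  # same gesture, context-specific meaning

# The same "dial_turn" primitive means different things in each context.
home_audio = Context("home audio", {"dial_turn": lambda: print("adjust volume")})
healthcare = Context("bedside monitor", {"dial_turn": lambda: print("scroll vitals history")})

home_audio.handle("dial_turn")  # -> adjust volume
healthcare.handle("dial_turn")  # -> scroll vitals history
```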

Radar Capabilities Diagram

Areas of Radar Integration

Flow of Interaction

Wake Up Triggers

Gesture Styles

Input Gestural Language

Primary Input Gestures

Embodiment Stories

By developing these interactive principles, we were able to devise a series of embodiment stories that allowed us to explore how Soli might be used in real-life scenarios. These stories were framed around three kinds of radar device (a sketch of the wake-up flow they share follows the list):

1: Radar as ‘precious device’, which enables its owner to access and interact with their networked digital products without the need to touch or even see them.

2: Radar in a device, which enables Soli to be embedded within an existing device, such as a smartphone, allowing the device to sense the user’s context and activity, even through clothing. App developers can utilise the sensor’s capabilities to create new experiences or provide new service features.

3: Radar Everywhere, whereby Soli is integrated throughout the environment and users are invited to interact with their surroundings via a small vocabulary of simple, universally recognised gestures.
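
A recurring element across these stories is the flow of interaction driven by wake-up triggers: a sensor idles in low-power presence detection, wakes as a user approaches, and engages for active gesture input. The sketch below illustrates that flow hypothetically (the state names and distance thresholds are invented, not product values):

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()     # low-power presence detection
    AWAKE = auto()    # user detected nearby; interface hints appear
    ENGAGED = auto()  # close enough for active gesture input

# Hypothetical thresholds in metres; a real product would tune these per context.
WAKE_RANGE, ENGAGE_RANGE = 1.0, 0.3

def interaction_state(distance: float) -> State:
    """Map a single range reading to a stage in the interaction flow."""
    if distance > WAKE_RANGE:
        return State.IDLE
    if distance > ENGAGE_RANGE:
        return State.AWAKE
    return State.ENGAGED

# A hand approaching an embedded sensor walks through the whole flow.
for d in (2.0, 0.8, 0.2):
    print(f"{d} m -> {interaction_state(d).name}")
```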

Embodiment Stories: Radar as ‘Precious Device’

Embodiment Stories: Radar in a Device

Embodiment Stories: Radar Everywhere

Product Interaction Envisioning (image by ATAP)

Product Interaction Envisioning (image by ATAP)

Product Interaction Envisioning (image by ATAP)

Product Interaction Envisioning (image by ATAP)

Functional Prototype of Harman Home Audio System (image by ATAP)