Interaction that feels human.


As technology becomes ever more present in our lives, it’s fair to start asking it to take a few more cues from us. That is why we believe that natural interactions, such as body movements and gestures, help you interact more seamlessly with the everyday objects around you—and are a key catalyst toward a vision of ambient computing.

In service of this vision, we created a new interaction paradigm based on the nuances of human movement—from body language and gestures to even submillimeter motions, such as a heartbeat—all with the help of a miniature radar chip called Soli.

Our ultimate goal is for computers to understand us in a way that is a bit closer to the natural ways we understand each other.

From implicit to explicit interactions.

We’ve built a set of interactions using Soli according to the levels of proximity and engagement between a person and a device. Soli interactions exist on a continuum between 1) the passive sensing of people’s spatial relationships around the device (implicit interactions), and 2) the active sensing of direct gestural interactions (explicit interactions).

  • Implicit
    Soli is aware of nonverbal aspects of people around a device, such as their proximity, body language, and even biosignals like heartbeat and respiration.

  • Explicit
    Soli is aware of and responsive to your explicit gestures around a device, including full hand movements or microgestures using just your fingers.
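This implicit/explicit continuum can be pictured in code. The sketch below is purely illustrative—the event fields, gesture names, and classification rule are assumptions, not Soli’s actual API—but it shows how a device might sort sensed events onto the two ends of the continuum.

```python
from dataclasses import dataclass
from enum import Enum, auto

class InteractionMode(Enum):
    """The two ends of the Soli interaction continuum."""
    IMPLICIT = auto()  # passive sensing: presence, proximity, body language
    EXPLICIT = auto()  # active sensing: deliberate gestures

@dataclass
class SensedEvent:
    """A hypothetical sensor event; both fields are illustrative assumptions."""
    kind: str          # e.g. "presence", "reach", "swipe"
    deliberate: bool   # True when the motion is a directed gesture

# Movements treated as explicit commands in this sketch (assumed set).
EXPLICIT_KINDS = {"swipe", "dial", "slider"}

def classify(event: SensedEvent) -> InteractionMode:
    """Map a sensed event onto the implicit/explicit continuum."""
    if event.deliberate or event.kind in EXPLICIT_KINDS:
        return InteractionMode.EXPLICIT
    return InteractionMode.IMPLICIT
```

Under this rule, ambient presence reads as implicit while a swipe reads as explicit, mirroring the two categories above.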

A library of movements.

The Soli movement library consists of units of interaction which can be used to build complex interactive experiences that go beyond touch and voice.
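The idea of composing simple units of interaction into richer experiences can be sketched as a small registry. Everything here—the class, the movement names, and the callbacks—is a hypothetical illustration, not the actual Soli movement library.

```python
from typing import Callable, Dict

class MovementLibrary:
    """A hypothetical registry mapping movement units (e.g. 'presence',
    'reach', 'swipe') to application callbacks, so simple units can be
    composed into a larger interactive experience."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], str]] = {}

    def on(self, movement: str, handler: Callable[[], str]) -> None:
        """Register a callback for one unit of interaction."""
        self._handlers[movement] = handler

    def dispatch(self, movement: str) -> str:
        """Route a recognized movement to its handler, if any."""
        handler = self._handlers.get(movement)
        return handler() if handler else "ignored"

# Composing units: presence wakes the screen, reach surfaces controls.
lib = MovementLibrary()
lib.on("presence", lambda: "wake screen")
lib.on("reach", lambda: "show controls")
```

The design choice mirrors the text: each movement stays a small, independent unit, and the experience emerges from how an application wires them together.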

Soli is aware of the location of the people in its sensing area, just like you’re aware of someone entering the room you're in.

  • Presence
  • Multi-User Presence
  • Sit

Soli recognizes the body cues that typically start and end interactions, just like you recognize when someone establishes eye contact with you.

  • Lean
  • Turn
  • Reach

Soli responds to articulated hand and body gestures—both micro and macro—that you use to perform specific tasks, such as selecting, manipulating, and navigating content.

  • Dial
  • Slider
  • Swipe