Sensum

AI with a Human Touch

Real-time, unified data modelling for body, emotion & behaviour

See the Tech

Mobility

Human-reactive cockpit.

Learn more

Emotion AI

Meet Synsis: algorithms for human understanding.

Learn more


Embedded solutions.

Contact Us

Our universal AI engine, Synsis™, combines biometric data, voice and facial coding to model human emotions, behaviour and physiology.


Measure human data

Out of the box, Synsis interprets emotional signals from physiological data (e.g. heart rate, breathing, skin conductance), facial expressions and voice patterns.
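To make the idea of fusing physiological channels concrete, here is a minimal sketch of turning raw readings into a single arousal score. Every function name, resting range and weight below is hypothetical, chosen only for illustration; Synsis's actual models and API are not described here.

```python
# Illustrative fusion of three biometric channels into one arousal score.
# All ranges and the equal weighting are assumptions for the sketch;
# a production system would learn these from data.

def normalise(value, low, high):
    """Clamp a reading into [0, 1] relative to a typical resting range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def arousal_estimate(heart_rate_bpm, breathing_rate_bpm, skin_conductance_us):
    """Combine heart rate, breathing and skin conductance into a 0-1 score."""
    hr = normalise(heart_rate_bpm, 60, 120)       # beats per minute
    br = normalise(breathing_rate_bpm, 12, 30)    # breaths per minute
    eda = normalise(skin_conductance_us, 1, 20)   # microsiemens
    return (hr + br + eda) / 3                    # equal weights, illustrative

print(arousal_estimate(75, 16, 5))  # a calm-ish reading, near the low end
```

The point of the sketch is only that each channel is normalised against its own resting range before fusion, so no single sensor dominates the score.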


Analyse the environment

Plug in other data streams from the surrounding context for more accurate and nuanced processing of human data.


Identify human states

One unified AI system measures a wide range of emotional, behavioural and physiological states, modelling your users' experiences moment by moment.


Real-time Feedback

Make your products empathic by triggering real-time interactive feedback on changes in your users' moods and actions.
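The four steps above (measure, add context, identify the state, respond) can be sketched as one tick of a simple event loop. None of the names or rules below come from the Synsis API; they are placeholders showing how a contextual signal, such as cabin noise in a vehicle, can change both the detected state and the triggered response.

```python
# Hypothetical measure -> contextualise -> classify -> respond loop,
# illustrating the four-step pipeline described above.

def classify_state(arousal, cabin_noise_db):
    """Map a fused arousal score plus context to a coarse state label.

    The thresholds are illustrative: the same high arousal reads as
    'stressed' in a noisy cabin but 'excited' in a quiet one.
    """
    if arousal > 0.7 and cabin_noise_db > 70:
        return "stressed"
    if arousal > 0.7:
        return "excited"
    if arousal < 0.3:
        return "calm"
    return "neutral"

def feedback_for(state):
    """Choose a real-time interactive response for each detected state."""
    return {
        "stressed": "lower cabin noise, soften lighting",
        "excited": "no intervention",
        "calm": "no intervention",
        "neutral": "no intervention",
    }[state]

# One tick of the loop: a high-arousal reading in a noisy cabin.
state = classify_state(arousal=0.8, cabin_noise_db=75)
print(state, "->", feedback_for(state))
```

The design point is that classification and feedback are separate stages: the same state model can drive different interventions per product.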


A Primer on Emotion Science for our Autonomous Future

A summary of key scientific models of emotion, and how they are being applied to create tech that can measure & respond to human emotions.


Why Your Car Should Know How You Feel

On the road to autonomous transportation, giving our vehicles the ability to measure our moods could be a matter of life and death.


How to Build Emotionally Intelligent Transportation

A quick tour through the world of empathic mobility, where vehicles & the surrounding digital world know how you feel & what to do about it (story + free infographic).

Breaking new scientific ground with our academic partners, built on 20 years of leadership in the psychology of emotion.