With Synsis embedded in your product, you can interpret data from any on- or off-body sensors to generate the emotion, behaviour & physiology insights that meet your business needs.


Our platform was designed from the start to be input-agnostic, allowing you to choose your sensors and data sources. Our patent-pending tools then clean, sync and tag the incoming data streams from multiple sources simultaneously.

Multimodal sensor fusion for:

  • Body biometrics
  • Face & voice analysis
  • Environmental data (for context)
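To make the clean-and-sync step concrete, here is a minimal sketch of aligning several sensor streams onto a shared clock, with each sample tagged by its source. All names and the zero-order-hold strategy are illustrative assumptions, not Synsis's actual patent-pending implementation.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float  # seconds since session start
    source: str       # hypothetical tag, e.g. "heart_rate", "cabin_temp"
    value: float

def sync_streams(streams, period=1.0):
    """Align multiple sensor streams onto a shared clock: for each
    source and each tick, keep the latest sample seen up to that
    tick (a simple zero-order hold)."""
    end = max(s.timestamp for stream in streams for s in stream)
    ticks = [i * period for i in range(int(end / period) + 1)]
    fused = []
    for t in ticks:
        frame = {"t": t}
        for stream in streams:
            past = [s for s in stream if s.timestamp <= t]
            if past:
                latest = max(past, key=lambda s: s.timestamp)
                frame[latest.source] = latest.value
        fused.append(frame)
    return fused
```

Each output frame carries one tagged value per source, so downstream models see a single synchronised timeline regardless of how many sensors contributed.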


The Synsis™ empathic AI engine applies a unified data modelling process to all your incoming data streams to produce scientifically validated human insights in real time. Synsis automatically prioritises the most relevant, highest-quality data streams available at each moment, for more accurate, robust and consistent analysis.

Universal empathic AI for:

  • User states, e.g. fatigue, stress, distraction.
  • Universal emotions, e.g. arousal (excitement), valence (positivity), dominance (control).
  • Journey mapping: model users’ changing moods & behaviours over time.
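One way to picture "prioritising the highest-quality streams" is confidence-weighted fusion: each stream's estimate of a dimension (say, arousal) is weighted by its current signal quality. This is a generic sketch under that assumption, not Synsis's actual algorithm.

```python
def fuse_estimates(estimates):
    """Combine per-stream estimates of one dimension (e.g. arousal)
    into a single value, weighting each by its reported signal
    quality in [0, 1]. Streams with quality 0 drop out entirely."""
    total = sum(quality for _, quality in estimates)
    if total == 0:
        return None  # no usable signal this frame
    return sum(value * quality for value, quality in estimates) / total

# Hypothetical frame: heart rate is clean, voice is noisy,
# the face camera is occluded and contributes nothing.
fused = fuse_estimates([(0.8, 0.9), (0.4, 0.3), (0.6, 0.0)])
```

The occluded stream is ignored automatically, which matches the claim that the engine stays accurate as the available sensors vary from moment to moment.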


Synsis goes beyond basic emotion labels to infer user states that are truly relevant and actionable. With these outputs you can customise your products, services and experiences to respond ‘empathically’ to your users from one moment to the next.

Empathic human-machine interaction for:

  • Continuous user measurement.
  • Optimising user engagement & entertainment.
  • Maximising user health, enjoyment and performance.

Product Features

Clean & Sync

All incoming data automatically with our patent-pending data management tools.

Run Synsis

In the cloud, locally in your product environment, or embedded on chip.

Universal Metrics

That give you only the user states & values that are most relevant to your business needs.

Real-Time Analysis

Of human states, across emotion, behaviour & physiology.


Synsis is built to understand data from your choice of sensors and the data types they output.

Modelled Scientifically

Synsis produces a 'dimensional' model of the user based on leading psychological methods.
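Dimensional models in affective science typically place the user in a continuous valence-arousal-dominance (VAD) space rather than a fixed category. As a hedged illustration of that idea (the coordinates and labels below are invented for the example, not validated values):

```python
from dataclasses import dataclass

@dataclass
class AffectState:
    valence: float    # positivity, -1..1
    arousal: float    # excitement, -1..1
    dominance: float  # sense of control, -1..1

def nearest_label(state, prototypes):
    """Map a point in VAD space to the closest discrete label by
    Euclidean distance to illustrative prototype points."""
    def dist(p):
        return ((state.valence - p.valence) ** 2 +
                (state.arousal - p.arousal) ** 2 +
                (state.dominance - p.dominance) ** 2) ** 0.5
    return min(prototypes, key=lambda kv: dist(kv[1]))[0]

# Illustrative prototypes only, not scientifically calibrated.
PROTOTYPES = [
    ("calm",     AffectState( 0.5, -0.6,  0.2)),
    ("excited",  AffectState( 0.7,  0.8,  0.4)),
    ("stressed", AffectState(-0.6,  0.7, -0.5)),
]
```

The continuous coordinates are the primary output; discrete labels like "excited" become a convenience layered on top, which is why dimensional models capture gradual shifts that category labels miss.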


For Human Data, Context is King

Current psychology goes far beyond simply labelling human states such as emotions in discrete categories like 'happy', 'sad', 'angry' or 'tired'. Instead, an appraisal must be made of the person's feelings, situation, history and other factors. Their experience of a situation can then be understood in its unique context.

Multimodal AI for a Multisensory World

Our product is designed to work the way humans do: blending sensory data from multiple sources simultaneously and comparing them for accuracy, relevance and context. The Synsis empathic AI engine fuses data & media from many sources to provide much richer emotion, behaviour and physiology insights than any single data stream could.