Synsis™ Developer Kit: a Decade of Empathic Technology in a Box

Analyse human data in minutes, not months, with the world’s first sensor-fusion kit for measuring real-time human states from body, face & voice data combined.

Accelerate your journey to custom human data models

For a decade we have been solving the challenges of sensor connections, synchronised data collection from multiple sources, and human-state labelling. Now we have packaged our algorithms and software into a single toolkit for customers to move fast on the road to empathic tech.

From VR and simulation to the racetrack – in the lab and in the wild, Synsis is designed to give you robust human-state modelling wherever your users go. Measure human data and trigger product features based on the user's current state, for richer human-machine interaction.

Define new Business Models

Rapidly prototype and test reactive and personalised experiences, using a human data-driven approach to create technology that responds to the way we feel.

Driver & Passenger Monitoring

Meet the requirements of product standards like Euro NCAP Road Map 2025 with real-time biometrics and a wide range of emotions and cognitive states including fatigue, distraction and stress – for safety, comfort and experiences.

One Solution for all your Teams

Accelerate your R&D roadmap, all the way from UX to advanced engineering, with the same tools for data gathering and modelling as for controlling actuators and infotainment systems.

Features

All of the following features are included in the Evaluate edition:

Functionality
- Sensor Fusion
- Real-time Data Feedback
- In-car Deployment
- Simulator Deployment
- Lab Deployment
- Tablet Interface
- Biometric Analysis
- Facial Analysis
- Voice Analysis
- Data Acquisition
- RESTful API
- Event Creator
- Data Access
- Data Backup
- Data Privacy
- GDPR Compliant
- OTA Updates

Feature Output
- Stress (RT)
- Comfort (RT)
- Arousal (RT)
- Valence (RT)
- Fatigue (RT)
- Distraction (RT)
- Face Emotion Features (RT)
- Voice Emotion Features (RT)
- Age
- Gender
- Location
- Speed

Input Data Streams
- Heart Rate (HR)
- Heart Rate Variability (HRV)
- Breathing Rate (BR)
- Galvanic Skin Response (GSR)
- Skin Temperature (ST)
- Electrocardiogram (ECG)
- Video Frame (RGB)
- Video Frame (Infrared)
- Audio Signal
- Thermal Image

Sensors
- Biometric Radar
- Equivital Biometric Belt
- Face Camera (RGB & IR)
- Context Camera (RGB)
- Thermal Camera (FIR)
- Location Sensor (GPS)
- Wearable (GSR)
- Wearable (HR)

Services / Support
- Custom Model Development
- Annotation
- Protocol Design
- Technical Support
- Custom Feature Development

Preferred Partners

We are proud of our selection of world-class tech partners, each validated for exceptional standards of quality and accuracy.
Read more about our preferred partners here.

Fraunhofer
Equivital
audEERING

UX & Research: Collect human data in the wild

Gather data from a wide range of sensors for body, face & voice data. Add your own context data from almost any source.

Our patent-pending sensor-fusion solution brings all your data streams into one place – synchronised, tagged for events, and ready for analysis.
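The synchronisation step described above can be sketched in miniature. This is a hypothetical illustration, not Synsis's actual implementation: it assumes each sensor stream is a time-sorted list of samples on a common clock, and aligns every stream onto shared timestamps by nearest-sample lookup.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Sample:
    t: float      # timestamp in seconds, on a common clock
    value: float

def nearest(stream, t):
    """Return the sample in `stream` (sorted by t) closest in time to t."""
    i = bisect_left([s.t for s in stream], t)
    candidates = stream[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))

def fuse(streams, clock):
    """Align every named stream onto a shared clock by nearest-sample lookup."""
    return [
        {name: nearest(samples, t).value for name, samples in streams.items()}
        for t in clock
    ]

# Two streams sampled at slightly different times, fused onto one clock.
hr = [Sample(0.0, 62), Sample(1.0, 64), Sample(2.0, 63)]
gsr = [Sample(0.1, 0.31), Sample(0.9, 0.35), Sample(2.1, 0.33)]
fused = fuse({"hr": hr, "gsr": gsr}, clock=[0.0, 1.0, 2.0])
```

A real pipeline would also handle clock drift, dropped samples, and per-sensor latency; the point here is only the shape of the fused output: one record per tick, with every stream represented.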

Engineering: Build human-reactive prototypes

Scientifically validated algorithms provide a comprehensive model of the human user across a wide range of emotions and other cognitive & physiological states.

Trigger events based on the user's current state, to test and build products that respond empathically, for richer, more personalised human-machine interaction.
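One way to picture this kind of state-based triggering is a threshold with hysteresis, so an action fires once when a state (say, real-time stress) rises past a high mark and re-arms only after it falls back below a low mark. The class, state keys, and thresholds below are illustrative assumptions, not part of the Synsis API.

```python
class StateTrigger:
    """Fire an action once when a state value crosses a high threshold,
    re-arming only after it falls back below a low threshold (hysteresis)."""

    def __init__(self, key, high, low, action):
        self.key, self.high, self.low, self.action = key, high, low, action
        self.armed = True

    def update(self, state):
        value = state.get(self.key, 0.0)
        if self.armed and value >= self.high:
            self.armed = False      # fire once, then wait for recovery
            self.action(state)
        elif not self.armed and value <= self.low:
            self.armed = True       # state has recovered; re-arm

# Example: log an event whenever stress spikes above 0.7.
events = []
trigger = StateTrigger("stress", high=0.7, low=0.5, action=events.append)
for stress in [0.2, 0.8, 0.9, 0.4, 0.75]:
    trigger.update({"stress": stress})
```

The hysteresis gap (0.7 down to 0.5 here) prevents a noisy signal hovering around the threshold from firing the action repeatedly.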

Ethics

AI Ethics – From Principles to Standards

As a European company living and breathing GDPR, we have always taken a strong stance on privacy and human rights. This applies across all of our product processes, from how we gather data, to how we build models; from how we address potential bias, to how we consult our clients on building the empathic products of the future.

This ethos brought Sensum and the Institute of Electrical and Electronics Engineers (IEEE) together to initiate a working group developing a new global ethics standard for empathic technology. Read more.

The P7014 Standard for Ethical Considerations in Emulated Empathy in Autonomous and Intelligent Systems is the latest addition to the IEEE's P7000 series of standards in development, each focused on a different aspect of the ethics of autonomous and intelligent systems. We encourage you to join the working group and help shape the future standards of this technology.