Modelling Human States with Empathic AI – Our Scientific Process


As our digital world gets smarter, so too should the processes by which we develop the products and services that go into it. To create Sensum’s empathic AI technology, we have developed a process over a decade of problem solving around the globe. It is conducted through rapid design iterations, underpinned by robust science, uniting the fields of psychology, biology and technology. Come with us on a tour through Sensum’s scientific approach to building human-state models, for a future in which our machines can finally start to understand their human users.

I won’t cover the theoretical basics of how to model states like emotions, as you can get that from our earlier post: A Primer on Emotion Science for our Autonomous Future. But in short: human states such as emotions can be correlated with physiological signals, such as changes in heart rate, skin conductance, facial expression or tone of voice. And these signals can be measured via sensors on or around the person.

Data science in action, at the Sensum HQ in Belfast, N. Ireland

Strong Foundations

We are fortunate to have had a world-class emotion science institute as our partner since our company was founded nearly a decade ago. The School of Psychology at Queen’s University Belfast has been a recognised leader in affective computing throughout the twenty-odd years since the field first emerged, especially when it comes to creating ground-truth databases for emotion measurement. These days we maintain Knowledge Transfer Partnerships with the university, through which our team is bolstered by two full-time PhD graduates. Under this arrangement, our company can draw on the university’s research and expertise while retaining 100% of the IP generated, allowing us to commercialise this knowledge via our products and services.

Since our inception, we have built every project and product on scientific investigation, as can be expected when designing solutions for a whole new field of technology. Each new project brief has required an exploration into the ‘art of the possible’, to see if the desired human states can indeed be measured, and what technical challenges we must surmount to do so. Through this process we have worked not only with some of the world’s top brands but also with several academic institutions, including Ulster University, The University of Northampton, Goldsmiths, and Bangor University, authoring various scientific publications for journals and conferences along the way.

Along the way our tools have measured human data across 20 countries, online, in shops, in homes, in the wild; from 1 person to 15,000 people simultaneously, at 4,000 m, at 300 km/h; from the Arctic Circle to the Pacific Rim; in cars, planes, trains, boats, bicycles, motorbikes, parachutes, zip-lines... even a jetpack. We don’t know anyone else in our space with the same level of hands-on experience.

By having to solve new technical challenges in each project, we have built a range of tools, two of which are the subject of pending patent applications, and have honed a unique process that we now see as a marriage of psychology, biology and technology. We have also built our team accordingly, maintaining a broadly even split of expertise across those three fields.

The Process

A Machine-Learning Process for Model Development

I’ll now run quickly through the process of using digital technology to measure how someone is feeling, from a scientific perspective.

Starting with a hypothesis about how to correlate one or more physiological signals with the state that a person is feeling, we turn to the existing literature to investigate the idea and compile a comprehensive literature review.

Having gathered supporting evidence from the existing body of published science, we then design experiments for our in-house lab. In these tests we simulate an appropriate ground-truth state while measuring a range of biometric signals from on or around the person, along with any context data we can get our hands on (e.g. location, time of day).
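To make the shape of such a recording concrete, here is a minimal sketch of how one lab-test session might be structured in code, keeping the ground-truth label, the biometric signals and the context data together. The class and field names are illustrative assumptions, not our actual schema.

```python
# A minimal, illustrative sketch of one lab-test recording.
# All names and fields are invented for this example.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class LabRecording:
    participant_id: str
    stimulus: str             # e.g. the clip or task used to induce the state
    ground_truth_label: str   # the state we attempted to simulate
    # Each biometric channel is a list of (timestamp_seconds, value) samples.
    signals: Dict[str, List[Tuple[float, float]]] = field(default_factory=dict)
    # Context data captured alongside the biometrics.
    context: Dict[str, str] = field(default_factory=dict)

recording = LabRecording(
    participant_id="p042",
    stimulus="startle_clip_03",
    ground_truth_label="high_arousal",
    signals={
        "heart_rate_bpm": [(0.0, 68.0), (1.0, 71.0), (2.0, 83.0)],
        "skin_conductance_us": [(0.0, 2.1), (0.5, 2.3), (1.0, 3.4)],
    },
    context={"location": "lab_belfast", "time_of_day": "morning"},
)
```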

The insights from our lab tests give us the basis for creating or improving our human-state models. Each model aims to emulate the psychophysiological manifestation of a particular human state. Put another way, each model is a body of mathematical code, comprising an algorithm together with a set of filters and parameters, that translates a specific set of biometric signals into a score for one or more human states.
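As a toy illustration of that “algorithm plus filters plus parameters” idea, here is a hedged sketch in Python. The smoothing window, weightings and normalisation ranges are all invented for the example; a real model’s parameters would be derived from lab data, not hard-coded.

```python
import statistics

# Parameters (illustrative values only)
SMOOTHING_WINDOW = 3               # moving-average filter width
HR_RANGE = (50.0, 120.0)           # plausible heart-rate bounds (bpm)
SC_RANGE = (1.0, 10.0)             # skin-conductance bounds (microsiemens)
WEIGHTS = {"hr": 0.5, "sc": 0.5}   # per-signal weighting

def moving_average(samples, window=SMOOTHING_WINDOW):
    """Filter: smooth a raw signal with a simple moving average."""
    return [
        statistics.mean(samples[max(0, i - window + 1): i + 1])
        for i in range(len(samples))
    ]

def normalise(value, lo, hi):
    """Map a raw reading onto a 0-1 scale, clamped to the expected range."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def arousal_score(heart_rate, skin_conductance):
    """Algorithm: translate two biometric signals into one state score."""
    hr = moving_average(heart_rate)[-1]
    sc = moving_average(skin_conductance)[-1]
    return (WEIGHTS["hr"] * normalise(hr, *HR_RANGE)
            + WEIGHTS["sc"] * normalise(sc, *SC_RANGE))

print(arousal_score([68, 71, 83, 95], [2.1, 2.3, 3.4, 4.8]))  # ~0.37
```

The separation matters for the iterative loop described below: the filters and parameters can be re-tuned from new lab data without rewriting the algorithm itself.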

Now that we’ve been to the lab and generated sufficient evidence for our new or updated model, the work typically splits along parallel tracks, with (1) our research team running further tests in ever more realistic and nuanced scenarios, (2) the data science team improving the models via a range of machine-learning experiments, and (3) the software engineering team turning the model into code for deployment in digital systems.

The most challenging aspects of the process tend to be:

Sensors:

  • Managing multiple wireless devices competing for the same airspace.
  • Onboarding new sensors as they come onto the market, and staying on top of updates to existing ones. And of course the same goes for any third-party software required for the system.

Data management:

  • Synchronising multiple data streams, all with different formats, frequencies and so on (see the sketch after this list).
  • Building sufficiently large, high-quality data sets for extracting scientific insights and training algorithmic models.

Annotation:

  • Your data sets are only as good as your annotations. There is a constant need to provide labels and scores, both manually and automatically, to be able to make sense of the growing body of data.
  • The fuzzy, subjective nature of human emotions requires carefully designed R&D protocols, and carefully trained annotators.
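On the synchronisation point above, here is a minimal sketch of aligning two streams recorded at different rates onto a common time grid, assuming pandas is available. The column names and sample rates are invented for illustration.

```python
import pandas as pd

# Heart rate sampled at 1 Hz, skin conductance at 4 Hz.
hr = pd.Series(
    [68.0, 71.0, 83.0],
    index=pd.to_datetime([0, 1, 2], unit="s"),
    name="heart_rate_bpm",
)
sc = pd.Series(
    [2.1, 2.2, 2.3, 2.6, 3.4, 3.9, 4.2, 4.6, 4.8],
    index=pd.to_datetime([i * 0.25 for i in range(9)], unit="s"),
    name="skin_conductance_us",
)

# Resample both streams onto a common 500 ms grid, interpolating the gaps,
# so every row carries a value for every signal at the same instant.
grid = pd.concat([hr, sc], axis=1).resample("500ms").mean().interpolate()
print(grid)
```

In practice each device also has its own clock drift and timestamp format, so this alignment step is rarely as clean as the toy example suggests.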

The Tools for the Job

As we have gone through each iteration of the process and encountered each of the problems listed above, we have endeavoured to build systems that will help us do it more easily next time. Along this journey we created tools for interfacing with wireless sensors, recording & synchronising sensor data, running human-state models on the data in several programming languages, and providing the resulting metrics for data analysis or interaction design. In recent months, we have finally been able to bring these technical solutions together into a single toolkit for measuring human states: meet our Synsis™ Empathic Technology Developer Kit.

Our Synsis Developer Kit, a Decade of Empathic Tech Solutions… in a Box

For anyone wanting to measure the way someone feels in response to a stimulus, such as when interacting with a product, Synsis aims to get you past all the work outlined above and straight into the measurement itself. No need to integrate new sensor SDKs, or get bogged down in data cleansing. No need to develop your own algorithmic models for interpreting the signals you record. Instead, Synsis allows you to focus on the valuable part, the goal: measuring people’s emotional response and, in many cases, testing ways for your product or service to respond appropriately to those feelings.

The Synsis kit comes packaged with a set of generalised models, developed through the scientific process described above. They serve as a platform from which to jump off into more advanced modelling to suit your specific needs. Across all industries and use-cases, we envisage that the new generation of empathic interaction will be led by those organisations who imbue their products and services with customised algorithms that have been designed to behave optimally in their particular setting. By iteratively looping through the model-design steps above, you can hone your models down to the sensitivity and robustness you need, whether that’s for life-saving safety applications or casual entertainment. Ultimately, the most advanced models may eventually be tailored right down to not only specific uses, but users.

The New Empathic Paradigm

We believe that the next big step for the burgeoning world of AI-enabled technology is to imbue that tech with the tools to understand and respond to its users’ feelings. This emerging generation of empathic technology brings with it world-changing potential and so deserves special consideration in its development and deployment. As digital products and services become smarter, more personalised and more intimate, we hope to see the leading providers taking more rigorous steps than their predecessors to lead their innovations with high-quality, ethically mature science. It’s the right thing to do.


Ben Bland

Chief Operations Officer