3 Steps to Empathy: How to Win the Race for Personalised Tech


We are seeing the birth of a new generation of deeply personalised products, powered by AI that can understand and adapt to the individual user in the moment. The tech companies we are talking to in every sector are racing to establish their own version of this new level of rich user experience by adding empathic AI into their products. If you’re in one of those companies, a big question you are probably looking to answer is: how do we win the race?

Integrating empathic AI into your products requires innovation at the meeting-point of biology, psychology, data and software engineering. That is a rare mix of capabilities for any company. It’s why our team and our tools are here to accelerate your go-to-market journey at every step – through research, prototyping and build.

Based on thousands of interactions with engineers, research directors, innovation managers, investors and many more, we have distilled a set of general steps to onboarding empathic AI, and built our product and service offering around them. These three steps are applicable regardless of what area of business you are in, but for clarity I have provided recent examples from our work in the mobility sector. Here's an overview of the journey:

[Table: an overview of the three-step journey – R&D, Developer Tools and Embedded Tools – with Sensum’s corresponding solutions]

Let's break this down in more detail...

Step 1: R&D – Finding a Path

Creation of any good product starts with an understanding of the user. In a human-machine interaction environment, we now have the technology to measure human responses to product features and stimuli in real time, across a wide range of biometrics. This offers a whole new depth of user insight, down to a personal, emotional level, and can give you the knowledge you need to see what will be most useful, valuable and relevant in your product context.

As with the creation of anything complex, you must steer clear of assumptions and ready-made solutions, and instead start with testable hypotheses that will illuminate the best path forward. So you start running experiments, aimed at answering questions like:

  • Which sensors will work best in my product environment?
  • How do my users feel about their physiology and emotions being measured, or about the product responding to them? When is it effective and satisfying, and when is it annoying, distracting or harmful?
  • Which areas of human feelings should we focus on, eg. stress, excitement, engagement?
  • What are the best machine actions that my product could provide in response to the user’s current state?
  • What is the minimum number of sensors that will be useful?
  • What are the variations in user experience across different demographics or customer categories?

Anyone familiar with building new technologies will understand the challenge here. It is about starting down the development road with incomplete knowledge of the landscape ahead. If you wait until you have everything in place, when the market has matured and the product opportunities are obvious, you will be left behind. To solve this, successful innovators design experiments around a set of relevant variables to discover the likely best solutions through learning and deduction.
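
To make that concrete, here is a minimal sketch of enumerating an experiment matrix over a few such variables. Every variable name below (the sensor sets, stimuli and demographic groups) is an illustrative assumption, not a prescribed study design:

```python
from itertools import product

# Illustrative experimental variables - substitute your own product context.
sensor_sets = ["wrist_ppg", "chest_ecg", "face_camera", "voice_mic"]
stimuli = ["traffic_jam", "open_road", "voice_prompt"]
user_groups = ["commuter", "professional_driver", "new_driver"]

# Enumerate every combination so each condition can be tested and compared.
for sensors, stimulus, group in product(sensor_sets, stimuli, user_groups):
    # In a real study, each condition maps to one or more recorded sessions,
    # from which per-condition response metrics are later derived.
    print(f"Plan session: sensors={sensors}, stimulus={stimulus}, group={group}")
```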

Empathic AI research involves measuring the emotional response of users to specific stimuli, from which you can derive insights about what kinds of product features and user interactions may work best for you down the line.

[Image: Sensum Mobile Research Kit deployed in a car for real-life driving research, measuring body biometrics, face expression and voice parameters, as well as other context data and media.]

Sensum Solutions – Research Tools

For this need, we developed our Mobile Research Kit – a plug-and-play toolkit for deploying human-state research in almost any environment. It solves the challenge of connecting to multiple sensors – biometric wearables, video cameras (including infrared) and microphones – then collecting, syncing and tagging the data and media in one place before sending it into an empathic AI processor like ours to derive metrics from it all. We created a smartphone app for quickly connecting to sensors and setting up research sessions, and a dashboard for interrogating the raw & derived data, all so you can start testing users immediately, whether that’s in the lab or out in the wild.
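
To illustrate the syncing problem the kit solves, here is a minimal sketch (not the Mobile Research Kit’s actual implementation) of aligning two differently-sampled sensor streams onto one tagged timeline with pandas; the stream names, rates and values are assumptions:

```python
import pandas as pd

# Hypothetical raw streams: a wearable heart-rate sensor (~1 Hz) and a
# face-camera expression score (~2 Hz), each timestamped at the source.
hr = pd.DataFrame({"t": pd.to_datetime([0.0, 1.0, 2.1, 3.0], unit="s"),
                   "heart_rate": [72, 75, 81, 78]})
face = pd.DataFrame({"t": pd.to_datetime([0.0, 0.5, 1.5, 2.5, 3.5], unit="s"),
                     "smile": [0.1, 0.2, 0.6, 0.4, 0.3]})

# Align the slower stream to the faster one by nearest timestamp (within a
# tolerance), so downstream models see one coherent timeline.
merged = pd.merge_asof(face, hr, on="t", direction="nearest",
                       tolerance=pd.Timedelta("1s"))
merged["session"] = "demo_drive_01"   # tag for later interrogation
print(merged)
```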

To support your research, from controlled lab conditions to real-life testing, we have a well-honed fieldwork capability – we call it Siteworx – with which we have conducted groundbreaking research and developed fundamental science, in locations as diverse as racetracks, outdoor festivals and chilly mountainsides.

Example – Driver Research

This short behind-the-scenes video shows us using our Mobile Research Kit to explore the emotions of drivers on the road:

Step 2: Developer Tools – Putting in the Data Miles

Now that you have established broad insights from your research phase, you can begin to apply them to specific product contexts and narrow down the details of your potential build. Before you can, or should, commit major resources to any one solution, you now have the opportunity to trial different versions in rapid-prototyping iterations of build-measure-learn.

Fail fast, fail cheap.

At this stage you may be testing a wide range of potential solutions and interactive scenarios, or simply building a demo for temporary use. The engineering challenge here is to set up a fully functioning empathic AI workflow with minimal resource cost. This needs to cover:

  • Sensor selection and setup, covering your required range of human data types (eg. face, voice, body biometrics).
  • Data ingress, synchronisation, and cleansing.
  • Data processing and human-state modelling (to derive useful insights about the user).
  • Testing potential feedback and machine responses to derived features, such as visualising the data or controlling the machine’s actuators and settings.

To achieve all of this fast and cheap, you can piece together prototype versions of potential products using low-cost, off-the-shelf parts.
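
As a sketch of what such a cheap prototype loop can look like, the snippet below simulates one build-measure-learn pass: read a sensor, derive a crude human-state estimate, trigger a candidate machine response. The stress heuristic and thresholds are illustrative stand-ins, not Sensum’s models:

```python
import random
import time

def read_heart_rate():
    # Stand-in for an off-the-shelf wearable's live feed.
    return random.gauss(80, 12)

def is_stressed(window):
    # Toy human-state model: sustained elevated heart rate -> "stressed".
    # A production system would fuse face, voice and body biometrics.
    return sum(window) / len(window) > 95

def machine_response(stressed):
    # One candidate machine action to trial: calm the cabin under stress.
    print("dim lights, soften music" if stressed else "no action")

window = []
for _ in range(20):                                # one build-measure-learn pass
    window = (window + [read_heart_rate()])[-10:]  # rolling 10-sample window
    machine_response(is_stressed(window))
    time.sleep(0.05)                               # simulated sensor cadence
```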

Sensum Solutions – Developer Tools

For this need, we can provide a custom-built Developer Box with all the hardware and software required to allow your engineers to plug our empathic processing capabilities into your existing technology with minimal customisation or development. This can be done either online through our cloud API, or by connecting your system to a physical developer box that we supply with our full system running on it.
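
For the cloud route, the integration pattern is typically: send timestamped sensor samples to the API and receive derived human-state metrics in response. The endpoint, payload shape and field names below are placeholder assumptions for illustration, not our published API contract:

```python
import json
from urllib import request

# Hypothetical endpoint and payload shape - illustrative only; consult the
# actual API documentation for the real contract.
API_URL = "https://api.example.com/v1/empathic/process"
payload = {"session": "demo_drive_01",
           "samples": [{"t": 0.0, "heart_rate": 72, "smile": 0.1},
                       {"t": 1.0, "heart_rate": 75, "smile": 0.2}]}

req = request.Request(API_URL,
                      data=json.dumps(payload).encode(),
                      headers={"Content-Type": "application/json",
                               "Authorization": "Bearer <your-token>"})
with request.urlopen(req) as resp:    # response carries derived state metrics
    print(json.load(resp))            # e.g. {"stress": 0.4, "excitement": 0.7}
```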

We have deployed both our cloud and physical-box options in various environments for some of the planet’s biggest companies. We provide remote or onsite technical support and collaborate with our customers to co-develop solutions at high speed.


Example – The Ford Buzz Car

Read about how we added real-time measurement of the ‘buzz’ of driving a Ford Performance car.

[Image: The Ford Buzz Car, a demo empathic vehicle running Sensum Smarts under the hood]

Step 3: Embedded Tools – Winning the Race

At this point you know which parameters offer the greatest value to the customer, having identified the best sensors, metrics and machine interactions to focus on, and you can be confident of avoiding expensive mistakes and unproductive investments. The challenge has moved from selecting the best path to the detailed considerations of technical architecture and optimisation – this is the final sprint to win the race.

This is where we want to get to with all our customers. And often the conversation starts out as if they are already there. But in reality this level of product maturity is still rare in this field. You have to go through each step of the journey, ruling out options until you can set your sights on a narrow range of targets. In the famous words of Steve Jobs, ‘innovation is saying “no” to a thousand things’. Only with this clear focus can you confidently integrate a complex new tool into your stack.

Sensum Solutions – Embedded Tools

To solve this final step for our customers, we have spent much of the last year or so compiling our system into low- and mid-level code libraries that can be embedded straight into any product that can run software onboard, whether that be a car, a computer or a chip.
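
To give a flavour of what embedding such a library looks like from the host application’s side, here is a sketch that binds to a hypothetical shared library via Python’s ctypes; the library name and function signatures are invented for illustration, and a real integration would follow the vendor’s documented ABI:

```python
import ctypes

# Hypothetical embedded library - the name and C signatures below are
# invented for illustration only.
lib = ctypes.CDLL("./libempathic.so")
lib.ep_init.restype = ctypes.c_void_p
lib.ep_push_sample.argtypes = [ctypes.c_void_p, ctypes.c_double, ctypes.c_double]
lib.ep_get_stress.argtypes = [ctypes.c_void_p]
lib.ep_get_stress.restype = ctypes.c_double

ctx = lib.ep_init()                  # allocate an on-device processing context
lib.ep_push_sample(ctx, 0.0, 72.0)   # (timestamp_seconds, heart_rate)
lib.ep_push_sample(ctx, 1.0, 75.0)
print(lib.ep_get_stress(ctx))        # state derived entirely onboard
```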

We provide full support not just for embedding the empathic AI processing software, but also for tailoring the solution to deliver your product requirements optimally. We are currently running an Alpha Programme with selected customers to provide and support the embedded solutions they need.


Example

While we can’t go into much detail at this time, our recent CES review story provides a glimpse of the in-cabin sensing work we have been doing with Valeo, developing vehicles that react to the current state of their occupants.

Finally: How Far Have You Come?

We provide custom tools and expert service for every step of the journey to enhancing your products with empathic AI. We would be happy to discuss where you have got to so far, and which solution will suit you best. So, are you ready to take the next step?


Ben Bland

Chief Operations Officer