Product Leads’ Guide to Personalising User Experience with Empathic AI
Anyone building products or services that include sensors for collecting human data faces a big opportunity. By measuring how your customers and users feel, you can personalise your technology to respond empathically to them, unlocking new levels of interaction and experience. But to get there you have to navigate a path of decisions, avoiding poor investments that would lock you into an inadequate product experience.
Note: This is part of a four-part series on Human Data-Based R&D – see end for more.
Here we outline some common considerations gathered from helping our customers design empathic products, services and experiences over the last decade.
When planning to bring empathic features into a product or service, the key strategic decisions we see our customers facing are which:
- sensors to add to the system;
- metrics to measure from the sensor data;
- interactive functions to deliver as feedback to the user.
And overarching all of these decisions is the constant consideration of how to get to market as quickly and cheaply as possible, while protecting the user’s safety and personal rights as a priority, and still attaining optimal quality in the final experience you deliver.
It’s not easy. For anyone with strategic input on product development, the more you can do in the early stages of your development lifecycle to expose and mitigate issues, and discover value, the lower your chances of expensive backtracking further down the line. The digital revolution has accelerated innovation to the point that go-to-market timelines are shrinking year on year, while the complexity and risk inherent in the technologies that facilitate these workflows keep rising.
None of this is made easier when dealing with experimental new fields such as empathic AI, but you can’t wait for the competition to get out in front of you before dipping your toes in the water. To help you navigate this delicate path of trade-offs, we provide a toolkit to accelerate your R&D while preserving maximum freedom of choice in how your creation will function when it finally reaches the market.
In brief, when dealing with intelligent digital systems such as empathic tech, the milestones for product delivery run from early research, through analysis and engineering, to the final in-market experience.
From working with some of the world’s biggest brands at various stages of the product lifecycle, we can see the value of human data throughout the process, from providing objective and nonconscious feedback from participants in early research, to driving dynamic features in the final product experience. We have now packaged our technology into a single toolkit to go with you on this journey, regardless of the sensors, metrics and functions you decide to go with.
From the start, we have maintained a belief in being multimodal and sensor agnostic, in other words not being tied to a single human data model (e.g. facial coding, voice analysis, or a physiological metric) or a specific provider or type of sensor. The optimal set is different for everyone, and the solutions are changing all the time. Keeping your options open means you can adjust your setup as you go to maintain an optimal pathway.
As we proudly release the new version of our Synsis™ Empathic AI Kit, we are offering a developer platform for all your teams to use, through all the stages of product development, wherever you want to understand human data or connect the product experience to the user’s current state. To illustrate how you might do this, here are some simplified use-cases to consider.
1) Human-Responsive Car Cabin
Sensors throughout the vehicle cabin measure driver (or occupant) physiology and mood, monitoring for safety concerns (e.g. distraction, fatigue) as well as the ride experience (e.g. stress, frustration, happiness, comfort). Real-time analytics feed the vehicle’s systems to trigger a range of functions to help bring the driver or occupant back to a desired state (e.g. dashboard notifications, media content, driving assistance, HVAC controls). This kind of empathic vehicle design is increasingly being seen at the higher end of the personal market, and throughout corporate fleets – offering safer, more personalised and more enjoyable mobility experiences.
The Synsis Kit can be powered directly from the vehicle’s OBD-II port to gather vehicle telematics, while wirelessly syncing and processing a range of body, face, voice and context sensor feeds. This provides you with a holistic, real-time view of the driver or occupant experience on which to design responsive in-cabin features.
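To make the sense-analyse-respond loop concrete, here is a minimal sketch of how fused in-cabin metrics might drive a response. All names, weights and thresholds are illustrative assumptions, not the Synsis Kit’s actual API or calibrated models:

```python
from dataclasses import dataclass

# Hypothetical occupant state fused from body, face and voice sensor feeds.
@dataclass
class OccupantState:
    heart_rate_bpm: float   # e.g. from a wearable or seat sensor
    facial_stress: float    # 0..1 score from facial coding
    voice_arousal: float    # 0..1 score from voice analysis

def stress_index(state: OccupantState) -> float:
    """Blend the modalities into a single 0..1 stress estimate.
    The weights here are illustrative, not calibrated values."""
    hr_norm = min(max((state.heart_rate_bpm - 60) / 60, 0.0), 1.0)
    return 0.4 * hr_norm + 0.4 * state.facial_stress + 0.2 * state.voice_arousal

def cabin_response(stress: float) -> str:
    """Map the fused state to an illustrative in-cabin intervention."""
    if stress > 0.7:
        return "suggest_break"    # dashboard notification + driving assistance
    if stress > 0.4:
        return "calming_media"    # adjust media content and HVAC
    return "no_action"
```

In a real deployment the fusion weights and thresholds would be learned and personalised per occupant, rather than fixed as here.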
Example case studies:
- News Story: Sensum launches empathic tech toolkit for in-cabin safety, services and personalisation at IAA Frankfurt (customer: Valeo)
- Building the Ford Buzz Car (customer: Ford)
2) Optimising Human Performance Gains
In any environment where marginal performance variations are significant for results, modelling the internal state of the performer can provide winning insights. In competition, such as track racing or eSports, or in other high-stakes environments such as corporate leadership, teams can use empathic tech to explore how to achieve optimal states (e.g. calm focus). Measuring performers’ psychophysiological states in the wild, as they compete, reveals which factors drive optimal or suboptimal outcomes, and enables live biofeedback based on previously invisible signals.
High-performance teams in many sectors are starting to integrate biometrics and human state models into their data analytics stack, as well as using the data to help build more immersive and personal connections with their audiences by telling the performers’ stories at an emotional level.
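As one sketch of what live biofeedback could look like, the snippet below derives a short-term heart-rate-variability measure (RMSSD, computed from intervals between heartbeats) and turns it into a coaching cue. The threshold and cue names are hypothetical, chosen purely for illustration:

```python
import math

def rmssd(rr_intervals_ms: list) -> float:
    """Root mean square of successive differences between heartbeats (ms),
    a standard short-term heart-rate-variability measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def focus_cue(rr_intervals_ms: list) -> str:
    """Translate HRV into an illustrative coaching cue for calm focus.
    The 20 ms threshold is an assumption, not a validated cut-off."""
    return "breathe_and_reset" if rmssd(rr_intervals_ms) < 20.0 else "in_the_zone"
```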
Example case studies:
- In the Sports War for Marginal Gains, Human Data Could be the Key (customer: Goodwood Festival of Speed)
- Connecting Emotions to Extreme Sports & Next-Gen Media (customer: Red Bull Media House)
3) Dynamic Human-Computer Interaction
Real-time human state modelling can be used to optimise media content and interactive experiences. Measuring metrics such as engagement and excitement allows individuals to participate in highly personalised interactions, tailored to their current or long-term moods and behaviours.
Across all digital media touchpoints, providers are using human data to develop tailored programming and interactive features that maximise attention, learning and enjoyment – at an individual level. Empathic AI allows you to automate this interactive workflow.
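A minimal sketch of such an automated workflow: a rolling engagement score drives the choice of the next content segment. Thresholds and segment names are illustrative assumptions, not part of any real provider’s system:

```python
def choose_next_segment(engagement_history: list) -> str:
    """Pick the next media segment from a rolling average of recent
    0..1 engagement scores. Thresholds here are illustrative."""
    recent = engagement_history[-5:]          # rolling window of latest scores
    avg = sum(recent) / len(recent)
    if avg < 0.3:
        return "high_energy_segment"          # re-capture waning attention
    if avg > 0.8:
        return "extend_current_theme"         # the user is hooked; keep going
    return "standard_segment"
```

In practice the decision logic would be a learned policy rather than fixed thresholds, but the shape of the loop – measure, score, adapt – is the same.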
Example case studies:
- Sensum and Akamai reveal how consumers feel towards video streaming quality (customer: Akamai)
- Sensum at the Vauxhall International 2013 North West 200 (customer: BBC)
These examples only scratch the surface of the challenges and opportunities that empathic interaction brings to product and service design. We also take a deeper dive into how this field can be approached by the key teams in the development process – research, analytics and engineering – in the rest of this blog series (see below). But get in touch if you want to get deeper into the subject with us: firstname.lastname@example.org
In this series:
- Part 1: Researchers’ Guide to Human-Generated Data & Biometric Sensors – for Human Factors, UX, HX & Psychology Insights
- Part 2: Data Scientists’ Guide to Analysis & Modelling of Human Data from Biometric Sensors
- Part 3: Software Engineers’ Guide to Building Empathic Systems with Human-Generated Data