Software Engineers’ Guide to Building Empathic Systems with Human-Generated Data


As discussed in more detail in the first two parts of this series (see end), if you want to work with human data gathered from sensors on or around the user of your system, there are several hurdles to overcome before you can establish a clean feed of consistent, synchronised data. But once you have that, you are free to tinker in a whole new realm of intelligent systems called empathic technology: technology that can measure, and respond dynamically to, the current feelings of the user. So what does this involve?

Note: This is part of a four-part series on Human Data-Based R&D – see end for more.

In simple terms, you have the system collecting the data on one side, including sensors, connection protocols, data recording and data management; while on the other side you have the output that you wish to generate from the rich dataset you are building. We focus on the second part of that workflow here, as we covered the collection and management of data in more detail in the previous two posts. Your output could have a passive manifestation, such as a data visualisation or report view, or it could be active or interactive, such as triggering actuators or UI functions based on live features and events coming from the data.
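
To make that distinction concrete, the sketch below (in Python, with entirely illustrative feature names and thresholds) shows the two output styles side by side, assuming you already have a synchronised stream of feature samples coming off your data pipeline:

```python
# A minimal, generic sketch of the two output styles described above, assuming
# a synchronised stream of (timestamp, feature, value) samples from your data
# pipeline. Feature names and thresholds are purely illustrative.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class FeatureSample:
    timestamp: float   # seconds since session start
    name: str          # e.g. "heart_rate", or a derived feature like "arousal"
    value: float


def passive_output(samples: List[FeatureSample]) -> str:
    """Passive manifestation: summarise a session for a report or chart."""
    if not samples:
        return "no data recorded"
    values = [s.value for s in samples]
    return f"{samples[0].name}: n={len(values)}, mean={sum(values) / len(values):.2f}"


def active_output(sample: FeatureSample, trigger: Callable[[str], None]) -> None:
    """Active manifestation: fire a UI function or actuator on a live event."""
    if sample.name == "arousal" and sample.value > 0.8:  # illustrative threshold
        trigger(f"high-arousal event at t={sample.timestamp:.1f}s")


if __name__ == "__main__":
    session = [FeatureSample(t, "arousal", 0.1 * t) for t in range(10)]
    print(passive_output(session))        # report view
    for s in session:
        active_output(s, trigger=print)   # live triggering
```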

Sensors and mobile devices for wireless sensor connection and data recording.

In order to build any empathic system you have to deal with the fuzzy, subjective world of human psychology – and judging the optimal paths for which content or functions to surface in response to which incoming data features is far from a solved problem. Empathy is a new stream in human-computer interaction, and the world is still far from discovering what the “killer features” and exemplar use cases will turn out to be. So you need to be able to experiment. You need to be able to test, deploy and iterate – as fast as possible.

We have tried to tackle this need for rapid innovation cycles both in our own in-house work and in collaboration with our customers, which include some of the world’s biggest brands. It’s a major reason why we chose to package our system in the format of a developer platform. Our Synsis™ Empathic AI Kit, now in its second release, is designed to be as open and flexible as possible, so that instead of facing a long and complicated setup and learning period you can quickly get cracking with the exciting stuff: building empathic systems.

The most instructive way to express this is with real-world examples, so let’s have some.

Example 1: Interactive Visualisation

In the summer of 2018 we put together an interactive data experience at the Goodwood Festival of Speed. Out on the track, we were wirelessly gathering heart rate data from racing drivers, who we provided with wearable biometric sensors. On the other side of the festival grounds, in the Future Lab exhibit, visitors were able to watch, and interact with, real-time models that predicted the drivers’ arousal levels, via giant touchscreens.

Interactive touchscreen data visualisation at Goodwood Festival of Speed (in collaboration with Catapult and A&E)

The Synsis Kit facilitates this kind of setup by managing all sensor connections, data ingress, synchronisation and event tagging, as well as running our real-time human state models to predict the emotions and other states of the user. All of this happens locally on the Kit. You can then pull, or subscribe to, both raw and derived data from the kit via standard protocols. We have a RESTful API, as well as gRPC and MQTT support, allowing you to work with the data in a way that suits you.
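
As an illustration of the subscription route, here is a minimal sketch using the paho-mqtt client. The broker address, topic name and payload shape are assumptions for the purpose of the example rather than the kit’s actual schema:

```python
# A minimal sketch of subscribing to a derived-data stream over MQTT, using the
# paho-mqtt client (1.x callback style; paho-mqtt 2.x additionally requires a
# CallbackAPIVersion argument to Client()). The broker address, topic name and
# payload shape are assumptions for illustration, not the kit's actual schema.
import json

import paho.mqtt.client as mqtt

KIT_HOST = "synsis-kit.local"        # hypothetical address of the kit on your network
TOPIC = "synsis/derived/arousal"     # hypothetical topic for a derived feature


def on_connect(client, userdata, flags, rc):
    print(f"connected to broker (result code {rc})")
    client.subscribe(TOPIC)


def on_message(client, userdata, msg):
    # Assumed payload shape: {"timestamp": <float>, "value": <float>}
    sample = json.loads(msg.payload)
    print(f"{msg.topic}: value={sample['value']} at t={sample['timestamp']}")


client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(KIT_HOST, 1883, keepalive=60)
client.loop_forever()
```

The REST and gRPC routes follow the same idea: you request, or subscribe to, a named raw or derived stream and handle each sample as it arrives.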

You can read about the Goodwood project here: In the Sports War for Marginal Gains, Human Data Could be the Key

Example 2: Human-Responsive Car Cabin

We were challenged by our customer Valeo to demonstrate how the next generation of vehicles will be able to respond dynamically to how their occupants feel at any moment. In this example, the focus is on HVAC (the environmental settings in the cabin) aimed at making the occupant more comfortable without them having to manually adjust any controls. The demo showed how the environmental comfort settings can be driven by changes in the occupant’s biometric levels.

Sensum CEO Gawain Morrison at the Smart Cocoon demo at CES 2019.

For building responsive systems like this, the Synsis Kit comes with a REST API through which you can push and pull data streams. This gives you a common format for connecting the real-time feeds of the occupant’s raw biometrics, or derived features, to trigger actuators in the vehicle.

We have also aimed to make data ingress into the kit as easy as possible. The kit can be powered directly from the OBD-II port of the car, which provides straightforward access to the vehicle telematics. It also supports video devices over V4L and USB, and can accept a network stream via RTMP or RTSP. While we support a number of standard biometric peripherals out of the box, the outputs of proprietary devices can also be pushed to us via gRPC or sent as MQTT messages. The kit takes care of the multiplexing concerns and of accurate time-series recording of the unified data.
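
For example, pushing a reading from a proprietary sensor into the kit as an MQTT message might look something like the following sketch. The hostname, topic and payload fields are illustrative assumptions only:

```python
# A minimal sketch of pushing a reading from a proprietary sensor into the kit
# as an MQTT message, via paho-mqtt's convenience publisher. The hostname,
# topic and payload fields are illustrative assumptions, not the kit's schema.
import json
import time

import paho.mqtt.publish as publish

KIT_HOST = "synsis-kit.local"              # hypothetical address of the kit
TOPIC = "synsis/ingress/custom_gsr"        # hypothetical ingress topic

reading = {
    "device_id": "prototype-gsr-01",
    "timestamp": time.time(),              # the kit aligns samples to a common clock
    "value": 4.2,                          # e.g. skin conductance in microsiemens
}

publish.single(TOPIC, payload=json.dumps(reading), hostname=KIT_HOST, qos=1)
```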

Having the data and workflow in these common formats helps you to model and simulate responses, using logic to combine the occupant’s raw biometrics, or derived features, with context and vehicle data. Being able to simulate these responses and actuators in the vehicle, using accurately recorded sessions, allows for a rapid prototyping approach, so you can iterate towards a robust product with minimal inertia.
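
As a sketch of what that logic might look like, here is a simple rule-based example that combines a derived occupant feature with vehicle context to propose an HVAC change, then replays a recorded session to simulate the responses offline. Feature names, thresholds and actuator commands are all illustrative assumptions:

```python
# A minimal sketch of the feedback loop described above: simple rules combine a
# derived occupant feature with vehicle context to propose an HVAC adjustment,
# and a recorded session is replayed to simulate the responses offline. Feature
# names, thresholds and actuator commands are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Snapshot:
    arousal: float        # derived occupant feature, scaled 0..1
    cabin_temp_c: float   # vehicle context from telematics
    speed_kmh: float


def propose_hvac_change(s: Snapshot) -> Optional[str]:
    """Return an actuator command, or None if no change is needed."""
    if s.arousal > 0.7 and s.cabin_temp_c > 24.0:
        return "cool cabin by 2 degrees"   # try to calm an agitated occupant
    if s.arousal < 0.2 and s.speed_kmh > 100:
        return "increase airflow"          # counteract possible drowsiness
    return None


# Replay an accurately recorded session instead of a live vehicle feed.
recorded_session = [
    Snapshot(arousal=0.3, cabin_temp_c=23.5, speed_kmh=60),
    Snapshot(arousal=0.8, cabin_temp_c=26.0, speed_kmh=80),
    Snapshot(arousal=0.1, cabin_temp_c=22.0, speed_kmh=120),
]

for step, snap in enumerate(recorded_session):
    command = propose_hvac_change(snap)
    print(f"step {step}: {command or 'no action'}")
```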

This kind of empathic feedback loop can be applied not just to comfort or experience features, such as responding to stress or enjoyment, but also to critical safety features, when potentially dangerous states such as fatigue or distraction are detected. Indeed, this kind of driver monitoring is entering mandatory regulations, at least in Europe, as I write (see Euro NCAP Road Map 2025).

You can read more about the Smart Cocoon project here: News Story: Empathic Technology Showcase Reveals a Car That Knows How You Feel

OK, so that’s just a quick dip into the engineering steps involved in systems that can measure or respond to human states. There’s plenty more to discuss, and we’re keen to know how familiar you are with this new field of technology. Any recurring headaches and pinch-points? Any lessons learned? Your input is what drives our product roadmap, so please tell us more: hello@sensum.co

Read on…

In this series:

Related:

Ben Bland

Chief Operations Officer