Why Your Car Should Know How You Feel
On the road to autonomous transportation, giving our vehicles the ability to measure our moods could be a matter of life and death.
– Also published here on Medium.
Cars don’t kill people, drivers do.
At least, that was the opening line I had already drafted for this story when the news of the first pedestrian fatality from an autonomous vehicle came from the US on Sunday. The incident starkly highlights an important issue relating to the disruptive innovation that the transport sector is currently seeing:
What is the optimal path for rolling out fully autonomous transport?
Even while driverless cars are still in this nascent, exploratory phase, they have already established a record of being far safer than any category of human driver. The transition from human to machine driving ‘could, by midcentury, reduce traffic fatalities by up to 90 percent’. Taking the World Health Organization’s current estimate of the annual global rate of traffic fatalities at 1.2 million, that’s about a million people alive who wouldn’t be otherwise.
Nevertheless there have been a few accidents during the testing of autonomous vehicles, and last Sunday a person died in one of them. Where should we place our limit of tolerance for the risks of developing driverless systems out in the public domain? If we overregulate and slow the technology’s progress, many needless injuries and deaths could occur. But if development is pushed too fast, people could suffer avoidable harm from systems that aren’t yet ready for the road.
We can’t just sit back and watch. The automotive industry has a history of waiting to be told to implement safety features — like airbags and ABS — before spending the money on doing so. Additionally, software companies are in the habit of releasing early, bugs-and-all, then iterating on the public product based on customer feedback. The current revolution in mobility services is being led by both of these types of business, with powerful lobbying capability and a ravenous hunger to get out in front of the competition.
On the other hand, there is a lot of misguided fear and reticence pushing back against innovation. Much of the population has an inherent lack of trust in autonomous technology, plus an abhorrence of the possibility of ‘killer robots’, which could slow the introduction of potentially life-saving products.
Somewhere on this antagonistic see-saw of debate is a balancing point, with minimal harm on one side and rapid innovation on the other. The public discussion on autonomous mobility needs to be cool-headed, considerate and educational if we are to keep steering closer to that fine boundary line of optimal progress.
However steeply we ascend the adoption curve, we will face a challenging period of transition from error-prone human driving to relatively safe autonomous transportation. Navigating our way through this handover period will be as much about user-experience design as it will be about developing the technological solutions themselves.
To pilot our way through the waters ahead we need to develop a deep understanding of the relationship between Man and Machine.
Innovation Driver 1: Empathy Saves Lives
The technology to detect, and potentially avoid, life-threatening driver errors could already be in our hands.
While we humans are still masters of the steering wheel, let’s look at addressing the top causes of car accidents. This HuffPost article puts distracted driving at the top of the list, followed by drunk driving, then speeding.
There are many different ways to cut this statistical pie. For instance, having speeding and running red lights as separate factors from reckless driving seems odd. Surely they are both reckless? One thing I find particularly interesting is that falling asleep at the wheel accounts for only 7% of accidents in one survey but a disproportionate share of fatalities (21%). The same survey puts rear-enders at 23–30% of all crashes, though I would guess they are less often fatal.
Some of the tools for detecting these dangerous driver states are already available and others could be coming down the line very soon. If we can train our vehicles to be empathic — to measure human emotions, behaviour and physiology, and be able to respond appropriately — we could potentially solve many of the causes of accidents, injuries and fatalities on the road.
From our own R&D at Sensum, and the discussions we’ve had with automotive companies throughout the sector, these are some of the key driver states currently being tackled by empathic human-machine interaction:
Distraction: an apt concern in a time when our attention is constantly being hacked out of us by technology and media. Plotting the position of the driver’s head and direction of their gaze from an onboard camera can show how much they are paying attention to the road.
Fatigue: driving tired could be as bad as driving drunk, let alone actually falling asleep at the wheel. Signs of tiredness can appear in both internal biometrics (eg. heart or breathing rate) as well as external physiology (eg. blink rate and head-bobbing, as the driver starts to nod off).
Intoxication: despite wide legislation and a huge shift in cultural attitudes, drink-driving still happens, and still kills. Promising new technologies could embed a non-intrusive alternative to a breathalyser in our dashboard. For instance, an infrared camera could detect the telltale heat pattern in the body of someone who is inebriated (eg. warm nose, cool forehead).
Stress: our emotions are tightly linked to decision-making, and affect our ability to concentrate. Signals of stress, anxiety, anger and other overwhelming emotions can be measured in a wide range of physiological patterns, such as heart rate variability (from contact-sensors on the driver), facial expression (from cameras) or voice tone (from microphones).
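To make the idea concrete, here is a minimal, purely illustrative sketch of how signals like these might be fused into a single driver-risk score. The function name, the thresholds and the weights are all hypothetical assumptions for the sake of the example — a production driver-monitoring system would use far richer models and calibrated sensor data.

```python
# Hypothetical sketch: fusing a few driver-state cues into one risk score.
# All thresholds and weights below are illustrative assumptions, not
# values from any real driver-monitoring system.

def driver_risk_score(gaze_off_road_s, blinks_per_min, hrv_ms):
    """Combine simple driver-state cues into a 0-1 risk score."""
    risk = 0.0
    # Distraction: eyes off the road for more than ~2 seconds
    if gaze_off_road_s > 2.0:
        risk += 0.4
    # Fatigue: an unusually high blink rate can signal drowsiness
    if blinks_per_min > 30:
        risk += 0.3
    # Stress: low heart-rate variability is a common stress marker
    if hrv_ms < 20:
        risk += 0.3
    return min(risk, 1.0)

# An alert driver scores low...
print(driver_risk_score(0.5, 15, 60))   # 0.0
# ...while a distracted, drowsy, stressed one scores high.
print(driver_risk_score(3.0, 40, 15))   # 1.0
```

The point of the sketch is the architecture, not the numbers: each sensor channel contributes independent evidence, and the vehicle responds to the combined picture rather than to any single reading.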
Let’s face it, anyone could experience one or more of the above states without realising it, or at least underestimate their impact in the moment. The more impaired we are, the more impaired our self-assessment can be. Thus there is a vital case for training the computers in our vehicles to know us better than we know ourselves when we are in the driver’s seat.
The sensors and processing required to detect some of these life-threatening driver states don’t require the imagination of a sci-fi writer or the pockets of an oligarch. There are off-the-shelf sensors (such as cameras, microphones and wearables) that could be used to do the job and are available right now at consumer-level prices, if not already built into some vehicles.
Innovation Driver 2: a Better Ride
Achieving the goal of implementing empathic capabilities in transportation may be more a challenge of design than of technology, because people might just not be ready for it yet.
This point was nicely expressed in another HuffPost article: ‘air flight took many years before most people lost fear of being rocketed through the stratosphere at 500 mph in a pressurized tube propelled by exploding jet fuel. But few air travelers bat an eyelash now, calmly completing crossword puzzles and productively working on wifi as if it’s normal for humans to fly’.
Today, AI and autonomous vehicles face a comparable cultural hurdle to other transformative technologies of the past. It’s not enough to say that AI will almost eradicate road deaths. We have to go further, we have to make it so powerful and intuitive that it quietly disappears from view and is delightful to interact with. This means not only designing empathic functions that provide a more comfortable and entertaining ride but also optimising the entire user experience based on the data that empathic human-machine interactions provide.
By installing sensors and empathic processing software in vehicles now, automakers can begin gathering insights into what users will find valuable, acceptable, or creepy. The most innovative automotive companies have begun researching and prototyping in this area but could go much further by accelerating the rollout of in-cabin sensor technology and data gathering into their production vehicles — albeit with clear consent from the occupants being measured.
In summary, the effort of training our vehicles to measure and respond to our emotions and physiology should be a priority for the automotive industry. Detecting the driver’s moment-by-moment state could be a major step towards minimising road-deaths and injury. Further, the development of empathic human-machine interaction could provide the user insights we need to facilitate the tricky process of handing the steering wheel over to a computer.
You can read more about how such empathic interaction is possible, and why it matters, in our recent stories:
- How to Build Emotionally Intelligent Transportation (incl. free infographic)
- The Role of Human Emotions in the Future of Transport
- Empathic AI: The Next Generation of Vehicles Will Understand Your Emotions
- Measuring Emotion in the Driver’s Seat: Insights from the Scientific Frontline
- Sensor Fusion: The Only Way to Measure True Emotion