The Evolution of In-Cabin Sensing, From One CES to the Next
Technology’s ability to understand the mental and physical state of its users has come a long way in the last few years. Tracing this journey through the explosive noise of CES at the start of each year, we have a curious story of an industry quietly finding its place in the world. A couple of weeks ago we were back in Vegas as a key partner of the global auto-supplier Valeo, supporting their Health & Wellbeing Ecosystem with our in-cabin empathic AI. Wandering the endless floors of the world’s biggest tech show felt very different to the first time we were there, when we were witnessing our industry finally coming out of the lab and into the public mindset.
CES 2016: Take-Off
When we first stumbled into the melee of CES in January 2016, we were showing our emotion-sensing solutions at the startup area in the Sands. It turned out to be a key moment in the history of our young industry. Apple announced its purchase of Emotient just as the conference began and the deluge was unleashed. At the CES press morning we were touted to 1,200 press and brand managers as one of the hot startups and technologies to check out during the conference. Consequently we were swamped by industry folks from Bosch to Audi.
Back then it felt like the engineers, product owners and R&D teams that had been slowly chipping away at human body data and sensors for a few years finally had a genuine interest in defining products for in-cabin sensing. However, the biggest conversation at the time was about how the car was going to be able to drive itself, so that’s where most of the industry and media focus stayed.
CES 2018: Peak Volume
This time last year, end of Jan 2018, we were on the road on the US West Coast, touring round automotive R&D labs, meeting folks post-CES to discuss how we could work with them on sensor fusion for understanding human states.
On our tour we had heard car brands, particularly Toyota, but also Honda, Nissan, Hyundai and Byton, shouting loudly about body data and emotion measurement. In Toyota’s case, they had exhibited prominent illustrations of the full empathic customer journey, in which the technology would interact with you at a truly personal level throughout your day.
Tier 1 suppliers like Bosch, ZF and Panasonic were stating their case for this kind of technology to be found in the car of the future. And the location company Here was talking about the Internet of Everything, where big data, IoT and body data would all provide hyper-personal context to engaging with the world and tech around you.
There was a boldness about the companies that were stating this future direction. In automotive, it was the Asian OEMs that were leading the charge, and it may be no coincidence that they are also producing robots that require human-measurement sensors and empathic human interaction. It felt futuristic, even to a company like ours, whose core business is building empathic tech solutions. It felt like the companies around us, to whom we’d been pitching for so many years, had finally had an awakening moment and were going to make an empathic-tech-fuelled world a reality.
CES 2019: Silent Running
Fast forward a year. It’s CES 2019. It was an altogether more subdued affair.
Interestingly, after Toyota had been leading the charge on empathic technology at their big stand in 2018, they had almost no presence this year. Nothing in the main hall. No grand visitor experience. They hadn’t ditched their message about the steps to an empathic future, but it was only being pushed at a couple of shared stands in quieter areas of the conference.
Byton still had human biometrics displayed on their pre-recorded dashboard demo, but it looked pretty much the same as last year in the car. And this year there were no big screens around the stand showing off their vision like they had in 2018.
This year Kia was the only OEM publicly shouting about the power of emotions in transport, with their READ (Real-time Emotion Adaptive Driving) demo, and they were really shouting about it as the future of transportation. There were regular, well-rehearsed pronouncements from a compère, waxing lyrical about the value, accompanied by clips from the Kia engineering and research teams, broadcast across massive screens on their stand, with futuristic-looking pods for people to try out.
The only other company making a lot of public noise about human data was Panasonic. The value that they were touting ranged from detecting physical stress and dangerous exertion in the workplace (and potentially any space) through to identifying engagement and emotional states for health and entertainment.
In the interests of full transparency... when I took part in their body-stress experience the results suggested that I am far from fit or flexible, and that I could really do with a bit of a post-Christmas/CES workout programme. Probably not too far from the truth, to be honest. Their emotions & gaze interface was able to see that I was there, and male, and largely neutral to the demo, which was also true.
I work in the game of empathic interfacing so I’m hardened with a scepticism of others claiming that their systems can get it right reliably, robustly, repeatedly. I actively hunt out that robustness in those that claim they can do it, and I can quickly smell the bullshit in this space.
Bullshit detection is especially relevant at trade shows, which are all about showing off concepts to see what bites. Companies exhibit demos that appear to work, without worrying about the scrutiny of engineers being able to give the system a good beating. That’s fine for marketing testing, as long as we don’t pretend it’s anything that it isn’t.
I think this is why CES19 was so much quieter than last year, in regards to human data and emotional interfacing. A lot of companies, some of whom we’ve been working with, have taken the initial bold steps into this space and realised how hard it really is. But in that same step, during 2018, they have seen that it’s possible. They’ve started to uncover the implicit rules for understanding and reacting to humans. They now understand that in an always-on, real-time tech age, our empathic interfaces also need to work in real time. Ultimately they are now aware that we need to take the time to do this right – to put the weight of data, research and ethics behind this problem.
Concurrent to the industry getting its hands dirty with empathic tech in 2018, we saw a leap in its education level with respect to this space. Perhaps the biggest lesson for the industry was in understanding that one data stream, typically starting with a camera for facial coding, isn’t enough to understand humans.
Our species has spent millions of years evolving multiple sensors across our bodies that take in real-time data from the outside world, data that is then synced and amalgamated to compute useful features we can act on.
Everyone thought they could just use the cameras already out there in the world to do empathic interfacing. Now they know they need more data, and more data types, and that we don’t express meaningful signals with our face all the time.
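To make the point concrete, here is a minimal sketch of why a single camera stream falls short. It fuses several modalities by confidence weighting, so a face that happens to be expressing nothing contributes little to the final estimate. All the modality names, scores and the fusion rule here are illustrative assumptions, not anyone's shipping algorithm.

```python
from dataclasses import dataclass

@dataclass
class ModalityReading:
    """One sensor modality's estimate of a human state (names hypothetical)."""
    name: str            # e.g. "face_camera", "heart_rate", "voice"
    stress_score: float  # 0.0 (calm) .. 1.0 (stressed)
    confidence: float    # 0.0 .. 1.0: how much usable signal this modality saw

def fuse_stress(readings):
    """Confidence-weighted average across modalities.

    A camera-only system would have to trust the face even when it shows
    nothing; weighting by confidence lets other signals dominate instead.
    """
    total_weight = sum(r.confidence for r in readings)
    if total_weight == 0:
        return None  # no modality saw a usable signal
    return sum(r.stress_score * r.confidence for r in readings) / total_weight

# A mostly neutral face alone says little; fused with physiology it says more.
readings = [
    ModalityReading("face_camera", 0.2, 0.1),  # neutral expression, weak signal
    ModalityReading("heart_rate", 0.8, 0.9),   # strong physiological signal
    ModalityReading("voice", 0.7, 0.5),
]
print(round(fuse_stress(readings), 2))  # → 0.73, driven mainly by heart rate
```

The design point is simply that the face is one input among several, not the ground truth.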
The other big realisation has been that nobody actually knows what to do once you’ve identified a state like positivity, stress or fatigue. If you’re feeling tired, should the system fire up some AC/DC, turn the volume to 11 and wind the windows down? Or, if some jerk on the highway has just made you see red, should the chair start to massage you, while the nebuliser delivers a fine mist of lavender spray?
What if I prefer death metal to calm me in the midst of a traffic jam and I find massages creepy? Everyone will have their own flavour of appropriate response, widely varying across demographic, scenario, context or product. It’s been the same for human interactions throughout the history of our species.
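One way to frame this problem in code: the detected state alone cannot determine the action, because the right response is a function of the user and the context as well. The sketch below, with entirely hypothetical users and responses, shows a per-user preference table overriding a generic default.

```python
# Generic fallback responses per detected state (hypothetical examples).
DEFAULT_RESPONSES = {
    "fatigue": "suggest a break",
    "stress": "play calming playlist",
}

# Per-user overrides: the same state maps to very different actions.
USER_PREFERENCES = {
    "alice": {"stress": "play death metal"},  # her way of calming down in a jam
    "bob": {"fatigue": "open the windows"},
}

def respond(user, state):
    """Pick the user's preferred response for a state, else the default."""
    prefs = USER_PREFERENCES.get(user, {})
    return prefs.get(state, DEFAULT_RESPONSES.get(state, "do nothing"))

print(respond("alice", "stress"))  # → "play death metal" (preference wins)
print(respond("bob", "stress"))    # → "play calming playlist" (default applies)
```

In practice the lookup table would be replaced by something learned per person and per context, but the shape of the problem is the same: state detection is only the input to a personalisation layer, not the answer itself.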
These lessons learned throughout 2018 cast a different light on the conspicuous silence we had heard from many of the companies exhibiting at CES. All the action was still there, it was just behind closed curtains. Many of the meetings we had with big brands were hosted in side rooms. Even our own technology demo, which was part of a human-reactive cockpit exhibit in partnership with Valeo, was displayed in their invite-only innovation area.
All the serious players in the game had been screaming out their vision at CES18 and were now retreating into the quiet, serious matter of making empathic products a reality. Behind closed doors, the order of the day this year was product demos, roadmap discussions, and hunger to find partners who could deliver on these roadmaps. It wasn’t about what was on public show. It was about the hotel suites and innovation areas, where executives were deliberating on how to ensure the technology would be developed properly in the next phase.
I expect when each company is ready, they will come out hard and fast, disrupting mobility, entertainment, gaming, mental health & wellbeing, the workplace and many more industries all at once. Watch this space because it’s gonna move fast. And it’s exciting to be right slap in the middle of it!