Smart Eye Deputy CEO Rana el Kaliouby talks automotive AI
Rana el Kaliouby co-founded and led Boston startup Affectiva, which uses artificial intelligence and computer vision to analyze mood and emotion.
Now she’s got a new job as deputy CEO of Smart Eye, after the Swedish eye-tracking company bought Affectiva for $73.5 million in June.
The auto industry is the prime market for el Kaliouby and competitors like Australia-based Seeing Machines. Carmakers are bracing for new safety rules and standards around the world that could require dashboard cameras to detect dangerous driver behavior, especially in vehicles that are partly driving themselves but still need human attention.
El Kaliouby says that’s just the beginning of where in-car AI systems are going. This interview has been edited for length and clarity.
Q: Ten years from now, a family’s in a car. What might your technology be doing on their trip?
A: OK, family’s in the car. You’ve got two kids in the back seat. First of all, the kids are fighting. The car knows that and can see that mom, who’s driving, is getting frustrated, a little mad, distracted. The car intervenes by recommending content for the kids—or through a conversational interface, mediating a game between the kids. They play for a little while. They fall asleep. The car can see that, so the lights dim and the music or movie turns off. Then the car realizes mom is exhausted and also starting to doze off, so it gets into this chatty mode to reengage her. And then mom leaves the car, forgets the child is in there, and gets a text message that says, “Oh, you may have forgotten Little Baby Joe!” I’m making this up on the fly. It can basically personalize the whole cabin experience—music, lighting, temperature—based on knowing who’s inside the car and what they’re doing.
Q: What is Affectiva bringing to Smart Eye, and vice versa?
A: Smart Eye is a 22-year-old company. What they’ve been focused on the past couple of years—and they are the undisputed market leader—is driver monitoring. They’re able to very accurately determine where a person is looking, and they also monitor eye behavior. They can identify when a driver is distracted or drowsy. They’ve been contracted by 13 global automakers. Affectiva spun out of MIT 12 years ago, and our focus is humanizing technology by bringing emotional intelligence to machines. We project driver monitoring is going to evolve into monitoring everything that’s happening inside the vehicle. What are the occupants’ moods and emotions? What activities are they engaged in? You become the eyes and ears of the car.
Q: How do you detect someone’s mood or emotions?
A: We do a lot of facial analysis, but we’ve expanded to do a lot of body “keypoint” tracking so we can detect what people are actually doing: Are you slouched in the car? Are you agitated? We monitor all of that.
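To make the keypoint idea concrete, here is a minimal sketch of how a posture flag could be derived from 2-D body keypoints. It is an illustration only, not Affectiva’s actual pipeline; the joint names, input format, and threshold are assumptions.

```python
# Illustrative sketch only -- not Affectiva's or Smart Eye's actual method.
# Assumes 2-D keypoints in normalized image coordinates (y grows downward)
# from any off-the-shelf pose estimator; joint names and the threshold
# are hypothetical.

def is_slouched(keypoints: dict, min_torso_height: float = 0.15) -> bool:
    """Flag a slouched posture: the visible torso 'shrinks' as a person folds forward."""
    shoulder_y = (keypoints["left_shoulder"][1] + keypoints["right_shoulder"][1]) / 2
    hip_y = (keypoints["left_hip"][1] + keypoints["right_hip"][1]) / 2
    return (hip_y - shoulder_y) < min_torso_height

# Example frame: the shoulders have dropped toward the hips.
frame_keypoints = {
    "left_shoulder": (0.42, 0.48), "right_shoulder": (0.58, 0.47),
    "left_hip": (0.44, 0.58), "right_hip": (0.56, 0.59),
}
print(is_slouched(frame_keypoints))  # True
```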
Q: What about someone’s face will tell you they’re panicked?
A: There are expressions of fear. You can also start tracking other vital signs, like heart rate or heart rate variability and breathing rate, via an optical sensor. That’s a direction we’re headed. It’s not at all ready for prime time, but it’s something Affectiva and Smart Eye are exploring. And once you know a person’s baseline, you can find out if they are deviating from that baseline, and the car can flag that.
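The baseline idea lends itself to a simple statistical sketch: keep a rolling window of a person’s own readings and flag values that drift too far from it. The window size and threshold below are arbitrary illustration values, not anything the companies have published.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative per-person baseline check; all numbers are made up.
WINDOW = 120        # recent readings that define the baseline
MIN_SAMPLES = 30    # wait for a minimal baseline before flagging anything
Z_THRESHOLD = 3.0   # standard deviations that count as "deviating"

baseline = deque(maxlen=WINDOW)

def check_heart_rate(bpm: float) -> bool:
    """Return True if a reading deviates from this person's own baseline."""
    if len(baseline) >= MIN_SAMPLES:
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(bpm - mu) / sigma > Z_THRESHOLD:
            return True  # deviating reading; don't fold it into the baseline
    baseline.append(bpm)
    return False
```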
Q: How do you protect against concerns you can misread someone’s emotion or mood based on race, gender, neurodiversity?
A: This is one of the things Affectiva’s really bringing to the table. It’s something we’ve been super intentional about. It starts with the diversity of the data. If you’re training an algorithm on data from middle-aged white men, that’s what it’s going to learn. The training set is critical, and it’s everything from racial and ethnic diversity to diversity of facial appearances—maybe people are wearing glasses or hijabs or have beards. We’re partnering with synthetic data companies to augment our data sets and fill in the gaps. The second thing is, how do you validate the accuracy of the algorithms? If you just look at high-level accuracy, it might be hiding biases that exist in specific subpopulations. We dissect the data to make sure no bias is creeping in. And finally, the diversity of the team is how you overcome these blind spots.
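Her point about high-level accuracy hiding subgroup bias is easy to make concrete: the same predictions, scored overall and then disaggregated, can tell very different stories. The records below are hypothetical.

```python
from collections import defaultdict

# Hypothetical (prediction, ground truth, subgroup) records.
results = [
    ("drowsy", "drowsy", "wears_glasses"),
    ("alert",  "drowsy", "wears_glasses"),
    ("alert",  "alert",  "no_glasses"),
    ("drowsy", "drowsy", "no_glasses"),
]

correct, total = defaultdict(int), defaultdict(int)
for pred, truth, group in results:
    total[group] += 1
    correct[group] += (pred == truth)

# Overall accuracy looks respectable while one subgroup lags badly.
print(f"overall: {sum(correct.values()) / sum(total.values()):.0%}")   # 75%
for group in total:
    print(f"{group}: {correct[group] / total[group]:.0%}")             # 50% vs 100%
```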
Q: What about the privacy of people who don’t want to be analyzed or watched in the car?
A: In automotive, the good news is none of the data gets recorded. You do all the processing on the fly and make an inference, say, if the driver is drowsy. The car will hopefully respond to keep the driver safe. I think there needs to be a lot of consumer communication and transparency about what exactly the sensor is doing. I imagine there will be scenarios where you can switch it off. But if it’s a safety consideration, like your semi-autonomous vehicle needs to know if you are paying attention so it can transfer control back and forth, I imagine you may not be allowed to turn it off.
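The infer-and-discard design she describes can be sketched as a loop in which raw frames never outlive a single iteration and only the inference result is acted on. The camera feed and model here are stand-in stubs, not a real automotive SDK.

```python
import random

def fake_camera(n_frames: int):
    """Stand-in for a cabin camera feed; yields fake frames of 'pixels'."""
    for _ in range(n_frames):
        yield [random.random() for _ in range(16)]

def drowsiness_score(frame) -> float:
    """Stub model; a production system would run a trained network on-device."""
    return sum(frame) / len(frame)

for frame in fake_camera(100):
    score = drowsiness_score(frame)   # inference happens on the fly
    del frame                         # raw pixels are never stored or transmitted
    if score > 0.7:                   # illustrative threshold
        print("driver may be drowsy -- cue an intervention")
```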
© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.