MIT researchers are developing an in-car personal robot that aims to change the way we interact with our cars.

MIT researchers and designers are developing the Affective Intelligent Driving Agent (AIDA) – a new in-car personal robot that aims to change the way we interact with our car.

AIDA is a social robot that acts as a friendly in-car companion. It is designed to use the driver’s mobile device as its face: the phone displays facial expressions and serves as the main computational unit, managing the information presented to the driver. AIDA provides smart navigation, hands-free access to messages and meetings, and a comforting presence for drivers.

https://youtu.be/jCiTYytpMpQ
Drivers spend a significant amount of time multi-tasking behind the wheel. These dangerous behaviors, particularly texting while driving, can lead to distractions and ultimately to accidents. Yet many in-car interfaces designed to address this issue neither take a proactive role in assisting the driver nor leverage aspects of the driver’s daily life to make the driving experience more seamless. In collaboration with Volkswagen/Audi and the SENSEable City Lab, MIT students are developing AIDA (Affective Intelligent Driving Agent), a robotic driver-vehicle interface that acts as a sociable partner.
AIDA uses facial expressions and strong non-verbal cues to engage the driver in social interaction. By leveraging the driver’s mobile device as its face, AIDA promotes safety, offers proactive driver support, and fosters deeper personalization.

AIDA communicates with the driver through a small robot embedded in the dashboard. “AIDA builds on our long experience in building sociable robots,” explains professor Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab. “We are developing AIDA to read the driver’s mood from facial expression and other cues and respond in a socially appropriate and informative way.”

AIDA communicates in a very immediate way: with the seamlessness of a smile or the blink of an eye. Over time, the project envisions that a kind of symbiotic relationship develops between the driver and AIDA, whereby both parties learn from each other and establish an affective bond.

To identify the set of goals the driver would like to achieve, AIDA analyses the driver’s mobility patterns, keeping track of common routes and destinations. AIDA draws on an understanding of the city beyond what can be seen through the windshield, incorporating real-time event information and knowledge of environmental conditions, as well as commercial activity, tourist attractions, and residential areas.

Affective Intelligent Driving Agent (AIDA) aims to change the way we interact with our cars.
Credit: Courtesy of the SENSEable City Lab

“When it merges knowledge about the city with an understanding of the driver’s priorities and needs, AIDA can make important inferences,” explains Assaf Biderman, associate director of the SENSEable City Lab. “Within a week AIDA will have figured out your home and work location. Soon afterwards the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam. On the way AIDA might recommend a stop to fill up your tank, upon noticing that you are getting low on gas,” says Biderman. “AIDA can also give you feedback on your driving, helping you achieve more energy efficiency and safer behavior.”
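The kind of inference Biderman describes, figuring out which frequent place is home and which is work, could plausibly fall out of when the car is parked at each. The hour thresholds, place names, and sample events below are assumptions for illustration only, not a description of AIDA's system.

```python
# A hedged sketch: label a driver's two most-visited places as "home"
# or "work" from the typical hour of parking events there. All names
# and thresholds here are hypothetical.
from collections import defaultdict

def label_home_and_work(stops):
    """stops: list of (place_id, hour_of_day) parking events.
    Returns {place_id: 'home' | 'work'} for the two most common places."""
    hours = defaultdict(list)
    for place, hour in stops:
        hours[place].append(hour)
    top_two = sorted(hours, key=lambda p: len(hours[p]), reverse=True)[:2]
    labels = {}
    for place in top_two:
        avg = sum(hours[place]) / len(hours[place])
        # Midday parking suggests a workplace; late-evening or
        # early-morning parking suggests home.
        labels[place] = "work" if 8 <= avg <= 18 else "home"
    return labels

events = [("elm_st", 22)] * 6 + [("elm_st", 7)] * 2 + [("office", 10)] * 5
print(label_home_and_work(events))
```

A real system would need far more robust signals, but even this toy rule shows how a week of parking data could separate the two roles.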

AIDA was developed in partnership with Audi and the Volkswagen Group of America’s Electronics Research Lab. The AIDA team is directed by Professor Cynthia Breazeal, Carlo Ratti, and Assaf Biderman.