CEO and PhD, EMOJ, Italy
Title: The use of Artificial Intelligence for emotion-aware car interface
Biography:
Luca Giraldi is the CEO of EMOJ, a company offering advanced Artificial Intelligence technologies to revolutionize the world of Customer Experience. Its motto is “we are an artificial intelligence company, but we put the human first, before the artificial”. EMOJ operates in the fields of automotive, retail, and culture, and is ranked by Unicredit Startlab and Bocconi among the 10 most successful Italian startups. Luca received a PhD in Industrial Engineering in 2019 and is an expert in digital transformation, customer experience, and empathic marketing.
Nowadays, driver monitoring is a topic of paramount importance, because distraction and inattention are a relevant cause of crashes and a major safety concern [1]. Current Driver Monitoring Systems collect and process dynamic data from sensors embedded in the vehicle and from RGB-D cameras to detect visual distraction and drowsiness, but they neglect the driver’s emotional state, even though research has demonstrated that emotion and attention are linked and that both affect driving performance [3]. For instance, negative emotions can alter perception and decision-making [4]. Consequently, complicated phenomena such as the effects of emotions on driving, and ways to exploit emotions to reduce distraction, still need to be explored.
Today, several methods and technologies allow the recognition of human emotions, and they differ in their level of intrusiveness [3]. Invasive instruments based on biofeedback sensors can affect the subjects’ behavior and the emotions they experience. Non-intrusive emotion recognition systems, i.e. those based on speech analysis and facial expression analysis, typically implement Convolutional Neural Networks (CNNs) for signal processing. However, no study has yet tested their effectiveness in a motor vehicle to enable emotion regulation [6].
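To make the idea concrete, CNN-based facial emotion classification can be sketched in a few lines of NumPy: one convolutional layer with ReLU, global average pooling, and a softmax over emotion classes. The emotion labels, layer sizes, and random (untrained) parameters below are illustrative assumptions only, not the system described in the cited works.

```python
import numpy as np

# Hypothetical emotion classes; real systems often use a similar basic-emotion set.
EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised", "fearful", "disgusted"]

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def classify_emotion(face, kernels, weights, bias):
    """Conv layer -> global average pooling -> linear layer -> softmax."""
    features = np.array([relu(conv2d(face, k)).mean() for k in kernels])
    return softmax(weights @ features + bias)

# Toy usage with random parameters, standing in for a trained network.
rng = np.random.default_rng(0)
face = rng.random((48, 48))              # a 48x48 grayscale face crop
kernels = rng.standard_normal((8, 3, 3)) # 8 learned 3x3 filters
weights = rng.standard_normal((len(EMOTIONS), 8))
bias = np.zeros(len(EMOTIONS))

probs = classify_emotion(face, kernels, weights, bias)
print(EMOTIONS[int(np.argmax(probs))])
```

With trained parameters, `probs` would be the per-emotion probability distribution fed to the downstream interface logic.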
In this context, the research focuses on the introduction of a multimedia sensing network and an affective intelligent interface able to monitor the driver’s emotional state and degree of attention by analyzing facial expressions, and to map the recognized emotions to the car interface in a responsive way, so as to increase human wellbeing and safety. The adopted emotion recognition technology is reported in [7] and is extended with additional features specific to the vehicle environment, in order to implement emotion regulation strategies in the human-machine interface and improve driving safety. The result is an emotion-aware interface able to detect and monitor human emotions and to react in case of hazard, e.g. by providing warnings and proper stimuli for emotion regulation, or even by acting on the dynamics of the vehicle. The figure reports the conceptual research framework. This work is part of a long-term project called NEXTPERCEPTION, funded by the European Union’s Horizon 2020 ECSEL Joint Undertaking (JU) under grant agreement No. 876487.
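The mapping from a recognized emotional state and attention level to interface reactions could, for illustration, be expressed as simple rules. The thresholds, state fields, and action names below are hypothetical assumptions for the sketch, not the actual NEXTPERCEPTION implementation.

```python
from dataclasses import dataclass

# Assumed set of negative emotions that should trigger regulation stimuli.
NEGATIVE = {"angry", "sad", "fearful", "disgusted"}

@dataclass
class DriverState:
    emotion: str      # label from the facial-expression classifier
    valence: float    # -1 (very negative) .. +1 (very positive)
    attention: float  # 0 (fully distracted) .. 1 (fully attentive)

def interface_response(state: DriverState) -> list:
    """Map the recognized driver state to emotion-regulation actions in the HMI."""
    actions = []
    if state.attention < 0.3:
        actions.append("audio_warning")           # immediate alert to the driver
        actions.append("limit_vehicle_dynamics")  # e.g. cap acceleration
    elif state.attention < 0.6:
        actions.append("visual_warning")
    if state.emotion in NEGATIVE or state.valence < -0.5:
        actions.append("calming_stimuli")         # e.g. soothing music, soft cabin light
    if not actions:
        actions.append("no_intervention")
    return actions

print(interface_response(DriverState("angry", -0.7, 0.25)))
# prints ['audio_warning', 'limit_vehicle_dynamics', 'calming_stimuli']
```

In the framework described above, such rules would sit between the emotion recognition module and the car interface, deciding when to warn, stimulate, or act on the vehicle dynamics.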