Emotion recognition by facial image acquisition: analysis and experimentation of solutions based on neural networks and robot humanoid Pepper / Amabili, Giulio; Iarlori, Sabrina; Millucci, Samuele; Monteriu', Andrea; Rossi, Lorena; Valigi, Paolo. - (2023), pp. 4784-4791. (Paper presented at the 2023 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2023, held in Istanbul, Türkiye, in 2023) [10.1109/BIBM58861.2023.10385696].
Emotion recognition by facial image acquisition: analysis and experimentation of solutions based on neural networks and robot humanoid Pepper
Iarlori Sabrina; Monteriu' Andrea; Valigi Paolo
2023
Abstract
Human-robot interaction and affective computing are cross-disciplinary fields whose connections are being further explored with the proliferation of humanoid robots that assist people in their daily activities. The proposed study aims to assess emotion recognition accuracy based on facial images recorded by the camera sensor of the humanoid robot Pepper during a human-robot interaction scenario. The emotions of the participants were elicited by viewing five different video clips, each corresponding to a different target emotion (namely happiness, sadness, anger, surprise, and neutrality). Emotion evaluation was carried out using Pepper's integrated classification module and convolutional neural networks, and the results were compared to determine the best emotion classification accuracy. The test group was diverse, particularly in terms of participant age, and included a significant number of elderly individuals who were not familiar with robot interaction. This diversity also allowed for an examination of user acceptability in the context of the human-robot interaction experience.
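For readers unfamiliar with CNN-based facial emotion recognition, the following minimal Python sketch illustrates the general idea of classifying the emotion shown in a single face image with a pre-trained convolutional network. It is not the pipeline used in the paper: the model file `emotion_cnn.h5`, the 48x48 grayscale input size, and the five-class label order are assumptions introduced purely for illustration.

```python
# Illustrative sketch only (not the authors' implementation): detect a face
# in an image and classify its emotion with a pre-trained CNN.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Assumed label order matching the five target emotions in the study.
EMOTIONS = ["happiness", "sadness", "anger", "surprise", "neutrality"]

def classify_emotion(image_path: str, model_path: str = "emotion_cnn.h5") -> str:
    # Detect faces with OpenCV's bundled frontal-face Haar cascade.
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "no face detected"

    # Keep the largest detected face, then crop, resize, and normalize it
    # to the (assumed) 48x48 grayscale input expected by the CNN.
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
    face = face.reshape(1, 48, 48, 1)

    # Hypothetical pre-trained emotion classifier; returns class probabilities.
    model = load_model(model_path)
    probs = model.predict(face, verbose=0)[0]
    return EMOTIONS[int(np.argmax(probs))]
```

In the study itself, the image would come from Pepper's camera stream rather than a file on disk, and the CNN's output would be compared against the label produced by Pepper's integrated classification module.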