
An Adaptive System to Manage Playlists and Lighting Scenarios Based on the User's Emotions

Generosi A.;Ciabattoni L.;Altieri A.;Ceccacci S.;Mengoni M.;Talipu A.;Turri G.
2019-01-01

Abstract

This paper introduces a new system capable of adaptively managing multimedia content (e.g. music, video clips, etc.) and lighting scenarios based on the user's detected emotional state. The system captures the emotion from the user's facial expression and maps it into a 2D valence-arousal space, where the multimedia content is also mapped, and matches both with a lighting color. Results of preliminary tests suggest that the proposed system is able to detect the user's emotional state and manage appropriate music and light colors in a symbiotic way.
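The abstract describes mapping a detected emotion into a 2D valence-arousal space and matching it with a lighting color. The paper does not specify the mapping function, so the following is only a minimal sketch of one plausible scheme: the angle of the (valence, arousal) point selects a hue on the color wheel, and its distance from the origin (emotion intensity) selects the saturation. The function name and the value ranges are assumptions for illustration, not the authors' method.

```python
import colorsys
import math

def emotion_to_light_color(valence, arousal):
    """Map a point in 2D valence-arousal space (each in [-1, 1]) to an
    RGB lighting color. Hypothetical mapping: the angle in the plane
    selects the hue, the distance from the origin the saturation."""
    angle = math.atan2(arousal, valence)           # angle in [-pi, pi]
    hue = (angle / (2 * math.pi)) % 1.0            # wrap onto the 0..1 hue wheel
    saturation = min(1.0, math.hypot(valence, arousal))  # intensity -> saturation
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)
    return tuple(round(255 * c) for c in (r, g, b))

# A neutral state (0, 0) has zero saturation, i.e. white light,
# while a high-valence, low-arousal state lands on a warm hue.
print(emotion_to_light_color(0.0, 0.0))   # → (255, 255, 255)
print(emotion_to_light_color(1.0, 0.0))   # → (255, 0, 0)
```

Under this sketch, nearby emotions in the valence-arousal plane yield similar colors, which is the kind of smooth emotion-to-light matching the abstract suggests.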
ISBN: 978-153867910-4

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/266294

Citations
  • Scopus: 2
  • Web of Science: 1