
Montanini, Laura; Cippitelli, Enea; Gambi, Ennio; Spinsante, Susanna. "Low complexity head tracking on portable Android devices for real time message composition." Journal on Multimodal User Interfaces (ISSN 1783-7677), vol. 9, no. 2, 2015, pp. 141-151. DOI: 10.1007/s12193-015-0174-7

Low complexity head tracking on portable Android devices for real time message composition

Montanini, Laura; Cippitelli, Enea; Gambi, Ennio; Spinsante, Susanna
2015-01-01

Abstract

For people who are totally or partially unable to move or control their limbs and cannot rely on verbal communication, an interface capable of interpreting their limited voluntary movements is essential, as it allows them to communicate with friends, relatives and care providers, or to send commands to a system. This paper presents a real-time software application for disabled subjects suffering from both motor and speech impairments, which provides message composition and speech synthesis functionalities based on face detection and head tracking. The proposed application runs on portable devices equipped with the Android operating system and relies on the OS's native computer vision primitives, without resorting to any external software library. In this way, the available camera sensors are exploited and the device's computational constraints are respected. Experimental results show the effectiveness of the application in recognizing the user's movements, and the reliability of the message composition and speech synthesis functionalities.
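
The abstract states that the application builds on Android's native computer vision primitives rather than on external libraries. The following is a minimal, hypothetical sketch of that general approach, not the authors' implementation: it uses the platform's Camera.FaceDetectionListener to follow the face position reported by the camera driver and the TextToSpeech engine to read out a composed message. The class name, the displacement threshold and the cursor-control hook (HeadTrackingDemoActivity, onHeadMoved, the ±200 threshold) are illustrative assumptions and are not taken from the paper.

// Hypothetical sketch, not the paper's implementation: combines Android's
// built-in face detection (Camera.FaceDetectionListener) with the native
// TextToSpeech engine to turn head movements into selection commands and
// spoken output, as described in the abstract.
import java.util.Locale;

import android.app.Activity;
import android.hardware.Camera;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;

public class HeadTrackingDemoActivity extends Activity
        implements Camera.FaceDetectionListener, TextToSpeech.OnInitListener {

    private Camera camera;
    private TextToSpeech tts;
    private float lastCenterX = Float.NaN;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Preview surface setup is omitted for brevity; a real app must attach
        // a preview surface before face detection can be started.
        camera = Camera.open(findFrontCameraId());
        camera.setFaceDetectionListener(this);
        camera.startPreview();
        camera.startFaceDetection(); // platform-native face detection

    }

    @Override
    public void onFaceDetection(Camera.Face[] faces, Camera camera) {
        if (faces.length == 0) return;
        // Face coordinates are reported in the camera driver's normalized
        // [-1000, 1000] space, independent of the preview resolution.
        float centerX = faces[0].rect.centerX();
        if (!Float.isNaN(lastCenterX)) {
            float dx = centerX - lastCenterX;
            // Hypothetical threshold: a large horizontal shift is interpreted
            // as a command to move the highlighted key on an on-screen keyboard.
            if (dx > 200) onHeadMoved(+1);
            else if (dx < -200) onHeadMoved(-1);
        }
        lastCenterX = centerX;
    }

    private void onHeadMoved(int direction) {
        // Application-specific: advance or rewind the highlighted key,
        // dwell to select, append the character to the message, etc.
    }

    /** Speak the message composed so far (pre-API 21 signature). */
    private void speakMessage(String message) {
        tts.speak(message, TextToSpeech.QUEUE_FLUSH, null);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.getDefault());
        }
    }

    private int findFrontCameraId() {
        Camera.CameraInfo info = new Camera.CameraInfo();
        for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
            Camera.getCameraInfo(i, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) return i;
        }
        return 0;
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (camera != null) {
            camera.stopFaceDetection();
            camera.stopPreview();
            camera.release();
            camera = null;
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) tts.shutdown();
        super.onDestroy();
    }
}

Because the driver reports face positions in a normalized coordinate space, thresholding the displacement between frames (rather than absolute pixel positions) keeps the behaviour roughly consistent across devices and camera resolutions.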
Use this identifier to cite or link to this document: https://hdl.handle.net/11566/226831