
User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living / Pires, Ivan Miguel; Garcia, Nuno M; Pombo, Nuno; Flórez-Revuelta, Francisco; Spinsante, Susanna; Teixeira, Maria Canavarro; Zdravevski, Eftim. - In: STATISTICS, OPTIMIZATION & INFORMATION COMPUTING. - ISSN 2310-5070. - Electronic. - 7:1(2019), pp. 1-25. [10.19139/soic.v7i1.548]

User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living

Garcia, Nuno M
Member of the Collaboration Group
2019-01-01

Abstract

The detection of the environment where the user is located is extremely useful for the identification of Activities of Daily Living (ADL). ADL can be identified using the sensors available in many off-the-shelf mobile devices, including magnetic and motion sensors, and the environment can also be identified using acoustic sensors. The main objective of this study is to recognize environments and some standing ADL for inclusion in a framework for the recognition of ADL and their environments. The study presented in this paper is divided into two parts: first, we discuss the recognition of the environment using acoustic sensors (i.e., the microphone), and second, we fuse this information with data from motion and magnetic sensors for the recognition of standing ADL. The recognition of the environments and the ADL is performed using pattern recognition techniques, in order to develop a system that includes data acquisition, data processing, data fusion, and classification methods. The classification methods explored in this study are different types of Artificial Neural Networks (ANN); the types are compared and the best methods are selected for the different stages of the developed system. Conclusions point to the use of Deep Neural Networks (DNN) with normalized data for the identification of ADL with 85.89% accuracy, the use of Feedforward Neural Networks (FNN) with non-normalized data for the identification of environments with 86.50% accuracy, and the use of DNN with normalized data for the identification of standing activities with 100% accuracy.
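To make the described pipeline concrete, the following is a minimal sketch, not the authors' implementation: it uses scikit-learn, and all feature dimensions, label sets, and network sizes are illustrative assumptions. It mirrors the two classification stages reported in the abstract: a feedforward network on non-normalized acoustic features for environment recognition, and a deeper network on normalized fused motion/magnetic features for ADL recognition.

# Illustrative sketch of the two-stage classification described in the abstract.
# Feature shapes, label sets, and network sizes are assumptions, not the paper's.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Placeholder acoustic features (e.g., MFCC-style vectors) for 3 environments.
X_env = rng.normal(size=(300, 26))    # 300 audio frames, 26 features each
y_env = rng.integers(0, 3, size=300)  # hypothetical labels: street / home / office

# Environment recognition: the paper reports an FNN on non-normalized data,
# so no scaler is placed in front of the network here.
env_clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
env_clf.fit(X_env, y_env)

# ADL recognition: fused motion and magnetic features; the paper reports a DNN
# (approximated here by a deeper MLP) on normalized data, hence the scaler.
X_adl = rng.normal(size=(300, 40))    # fused accelerometer/magnetometer statistics
y_adl = rng.integers(0, 5, size=300)  # hypothetical standing-ADL labels
adl_clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(128, 64, 32), max_iter=500, random_state=0),
)
adl_clf.fit(X_adl, y_adl)

print(env_clf.predict(X_env[:5]), adl_clf.predict(X_adl[:5]))

In practice the environment prediction would be fused with the motion/magnetic features as an additional input to the ADL stage; the two classifiers are kept separate above only to show the normalization choices the abstract reports for each stage.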
Files in this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/263480
Warning! The displayed data have not been validated by the university.
