
Leveraging Inertial Information from a Single IMU for Human Daily Activity Recognition / Scattolini, M.; Tigrini, A.; Verdini, F.; Iadarola, G.; Spinsante, S.; Fioretti, S.; Burattini, L.; Mengarelli, A. - (2024). (Paper presented at the 2024 IEEE International Symposium on Medical Measurements and Applications, MeMeA 2024, held in Eindhoven, Netherlands, 26-28 June 2024) [10.1109/MeMeA60663.2024.10596886].

Leveraging Inertial Information from a Single IMU for Human Daily Activity Recognition

Scattolini M.; Tigrini A.; Verdini F.; Iadarola G.; Spinsante S.; Fioretti S.; Burattini L.; Mengarelli A.
2024-01-01

Abstract

Recognizing human activities is important for home monitoring, where it can help clinicians detect possible cognitive or physical impairments early, as well as for rehabilitation and prosthetics control. Nowadays, with the spread of wearable devices embedding inertial measurement units (IMUs), e.g. bracelets and smartwatches, attention has moved toward automatically recognizing daily activities from this sensor information alone, including the accelerometer and gyroscope. However, the majority of studies consider either a small set of simple daily activities or computationally demanding models to recognize the chosen tasks. For this reason, the aim of the present study is to evaluate three shallow machine learning classifiers in recognizing 17 different complex activities using only the inertial information of a Myo armband, i.e. accelerometer, gyroscope, and quaternions. The best performance was reached by combining accelerometer and gyroscope features, with an accuracy of 98.6% using an LDA model. On the other hand, KNN appeared to be the most suitable classifier when dealing with quaternion information: an accuracy of 80% was obtained, against 75% for LDA and SVM. The results outperformed classification performances reported in the literature, also highlighting a possible role for quaternions in pattern recognition problems.
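The pipeline the abstract describes (windowed IMU signals, feature extraction, shallow classification) can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the synthetic data, the mean/standard-deviation features, the window length, and the choice of a plain KNN classifier are all assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    """Per-channel mean and standard deviation over one window
    of a 6-channel IMU stream (3-axis accelerometer + 3-axis gyroscope)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def make_windows(offset, n, length=128, channels=6):
    """Synthetic windows for one hypothetical activity class."""
    return [offset + rng.normal(size=(length, channels)) for _ in range(n)]

# Two synthetic "activities" with different signal statistics.
X = np.array([extract_features(w)
              for w in make_windows(0.0, 40) + make_windows(1.5, 40)])
y = np.array([0] * 40 + [1] * 40)

def knn_predict(X_train, y_train, x, k=5):
    """Majority vote among the k nearest training samples (Euclidean)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

# Resubstitution check on the synthetic set (illustrative only).
preds = np.array([knn_predict(X, y, x) for x in X])
accuracy = (preds == y).mean()
print(f"accuracy: {accuracy:.2f}")
```

With well-separated synthetic classes the sketch classifies nearly all windows correctly; the 98.6% and 80% figures reported in the abstract refer to the paper's real Myo-armband data and feature sets, not to this toy example.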
2024
979-8-3503-0799-3
979-8-3503-0800-6
Files in this item:
File  Size  Format
MeMeA2024_MS.pdf

Archive administrators only

Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 964.84 kB
Format: Adobe PDF
1571007197 final.pdf

Open access

Type: Post-print (version subsequent to peer review and accepted for publication)
License: All rights reserved
Size: 899.8 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/333734
Citations
  • PMC: n/a
  • Scopus: 1
  • Web of Science: 0