
A Lightweight CNN-Based Solution for Inertial Gesture Recognition on Tiny Edge Devices / Esposito, Marco; Raggiunto, Sara; Napoletano, Paolo; Belli, Alberto; Sciarroni, Monica Marconi; Storti, Emanuele; Pierleoni, Paola. - (2025). In: 30th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2025), Porto, Portugal, 09-12 September 2025. DOI: 10.1109/etfa65518.2025.11205794.

A Lightweight CNN-Based Solution for Inertial Gesture Recognition on Tiny Edge Devices

Esposito, Marco; Raggiunto, Sara; Belli, Alberto; Sciarroni, Monica Marconi; Storti, Emanuele; Pierleoni, Paola
2025-01-01

Abstract

The use of wearable devices with inertial sensors for gesture recognition is becoming increasingly common in extended reality and remote control applications. Fast and accurate gesture recognition is critical for real-time services such as industrial control, a requirement that drives the shift of computation toward the edge. This work aims to optimize deep learning models for direct integration into real-world edge devices. To this end, several convolutional neural network models with different configurations are developed and evaluated with standard performance metrics (accuracy and Macro F1 score). Moreover, to reduce model size with minimal accuracy loss, the models are quantized using full-integer quantization. Both non-quantized and quantized models are then tested on different boards through the STM32 Edge AI Developer Cloud, as an additional benchmark to verify that the proposed models meet speed and accuracy requirements even on low-cost, low-power edge nodes. The results show that the best lightweight configuration with post-training quantization achieves an inference time of 17.31 ms on the STM32L4R9I-DISCO board while maintaining an accuracy of 95.95% ± 0.31%, competitive with the state of the art and suitable for integration into industrial control frameworks.
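This record does not include the authors' code, but as an illustration, full-integer post-training quantization of a Keras CNN is commonly done with the TensorFlow Lite converter, whose output can then be benchmarked on STM32 boards. The sketch below is a minimal, hypothetical example: the placeholder 1D CNN, window shape, class count, and random calibration data are assumptions for illustration only, not the architecture or dataset from the paper.

# Illustrative sketch of full-integer post-training quantization with
# TensorFlow Lite, one common route to STM32 deployment. The network
# below is a placeholder 1D CNN over 6-axis inertial windows, NOT the
# paper's architecture, and random calibration data stands in for
# real recorded gestures.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 8        # assumption: number of gesture classes
WINDOW = (128, 6)      # assumption: 128 samples x 6 inertial axes

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=WINDOW),
    tf.keras.layers.Conv1D(16, 5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
# ... training on the gesture dataset would happen here ...

def representative_dataset():
    # A few hundred real input windows let the converter calibrate
    # activation ranges; random data is used here only as a stand-in.
    for _ in range(200):
        yield [np.random.rand(1, *WINDOW).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full-integer quantization: weights, activations, inputs and
# outputs all int8, as required by integer-only MCU runtimes.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("gesture_cnn_int8.tflite", "wb") as f:
    f.write(converter.convert())

The resulting .tflite file can then be uploaded, for example, to the STM32 Edge AI Developer Cloud to obtain per-board inference-time and memory figures such as those reported in the abstract.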
2025
ISBN: 9798331553838
Files in this record:

Real-Time Hand Gesture Recognition System with User Identification for Industrial Applications.pdf
  Access: open access
  Type: post-print document (version after peer review, accepted for publication)
  License: publisher-specific license
  Size: 631.57 kB
  Format: Adobe PDF

Esposito_Lightweight-CNN-Based-Solution_2025.pdf
  Access: archive administrators only (copy available on request)
  Type: publisher version (published with the publisher's layout)
  License: all rights reserved
  Size: 681.11 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/353733
Citations
  • PMC: ND
  • Scopus: 1
  • Web of Science: ND