Preventing Elderly Injuries: A Pose-Based Framework for Fall Detection / Longarini, L.; Rongoni, A.; Dragoni, A. F.; Pompei, G.; Prist, M. (2025), pp. 759-764. (4th IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering, MetroXRAINE 2025, Ancona, IT, 22-24 October 2025) [10.1109/MetroXRAINE66377.2025.11340043].

Preventing Elderly Injuries: A Pose-Based Framework for Fall Detection

Longarini L.; Rongoni A.; Dragoni A. F.; Prist M.
2025-01-01

Abstract

Automatic fall detection systems that can rapidly identify dangerous events are increasingly important for elderly care. This paper presents a novel fall detection methodology based on a modular four-step pipeline: dataset preparation, skeletal keypoint extraction, advanced feature engineering, and temporal pattern analysis. The feature engineering process transforms human pose data into rich representations capturing ground proximity, body orientation, and motion dynamics: key biomechanical signatures that differentiate falls from daily activities. These engineered features are then analyzed through a temporal framework that models the distinctive sequential patterns characteristic of falling events, evaluating LSTM, GRU, and Bidirectional LSTM architectures. For implementation, the You Only Look Once pose model (YOLO11m-pose) was employed in the keypoint extraction step, while the temporal classifier module tests the above networks. Evaluations on the GMDCSA24 dataset demonstrate the approach's effectiveness. By utilizing skeletal pose data and a plug-and-play design, the approach offers multiple advantages, including robust performance across different lighting conditions and adaptability to various deployment scenarios such as healthcare facilities, homes for the elderly, and public spaces where continuous monitoring is necessary.
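The abstract does not give the paper's exact feature definitions, so the following is only an illustrative sketch of how ground proximity and body orientation could be derived from COCO-style keypoints (the 17-point format produced by pose models such as YOLO11m-pose), together with the sliding-window step that would feed per-frame features to a temporal classifier. The function names, keypoint indices, and formulas are assumptions, not the authors' implementation:

```python
import numpy as np

# Assumed COCO keypoint indices (x, y pixel coordinates, y grows downward).
L_SHOULDER, R_SHOULDER, L_HIP, R_HIP = 5, 6, 11, 12

def fall_features(kpts: np.ndarray, frame_height: int) -> dict:
    """Illustrative per-frame biomechanical features from a (17, 2) keypoint array."""
    hip_mid = (kpts[L_HIP] + kpts[R_HIP]) / 2.0
    sh_mid = (kpts[L_SHOULDER] + kpts[R_SHOULDER]) / 2.0

    # Ground proximity: hip height normalized by frame height (1.0 = bottom edge).
    ground_proximity = hip_mid[1] / frame_height

    # Body orientation: torso deviation from vertical, in degrees
    # (0 = upright, 90 = horizontal, e.g. lying on the ground).
    torso = sh_mid - hip_mid
    torso_angle = np.degrees(np.arctan2(abs(torso[0]), abs(torso[1])))

    return {"ground_proximity": float(ground_proximity),
            "torso_angle_deg": float(torso_angle)}

def make_windows(seq: np.ndarray, win: int, stride: int = 1) -> np.ndarray:
    """Slice a (T, F) feature sequence into overlapping (win, F) windows
    suitable as input batches for an LSTM/GRU-style temporal classifier."""
    return np.stack([seq[i:i + win] for i in range(0, len(seq) - win + 1, stride)])

# Example: an upright pose, shoulders directly above hips.
upright = np.zeros((17, 2))
upright[[L_SHOULDER, R_SHOULDER]] = [100.0, 200.0]
upright[[L_HIP, R_HIP]] = [100.0, 400.0]
feats = fall_features(upright, frame_height=480)
```

Motion dynamics (the third feature family named in the abstract) would follow naturally as frame-to-frame differences of these per-frame values within each window.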
2025
ISBN: 9798331502799
Files in this record:
Longarini_Preventing-Elderly-Injuries-Pose-Based_2025.pdf

Archive administrators only

Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 1.19 MB
Format: Adobe PDF
Preventing Elderly Injuries A Pose-Based Framework for Fall Detection.pdf

Open access

Type: Post-print (version following peer review, accepted for publication)
License: Publisher-specific license
Size: 1.03 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/355417
Citations
  • Scopus 0