Morbidoni, C.; Cucchiarelli, A.; Fioretti, S.; Di Nardo, F. A deep learning approach to EMG-based classification of gait phases during level ground walking. Electronics 2019, 8(8), 894. ISSN 2079-9292. doi:10.3390/electronics8080894
A deep learning approach to EMG-based classification of gait phases during level ground walking
Morbidoni C.; Cucchiarelli A.; Fioretti S.; Di Nardo F.
2019-01-01
Abstract
Correctly identifying gait phases is a prerequisite for a spatial/temporal characterization of muscular recruitment during walking. The literature reports only a few machine-learning approaches to gait-phase classification from surface electromyographic (sEMG) signals during treadmill walking; to our knowledge, no attempt has been made during ground walking in daily-life conditions. A methodology for the classification of stance/swing phases and the prediction of the foot-floor-contact signal during ground walking in conditions similar to daily life is proposed here, based on the application of multi-layer perceptron (MLP) models to the sEMG signal alone. sEMG signals were acquired from eight lower-limb muscles over about 13,000 strides from 23 healthy adults during ground walking along a figure-eight path including natural deceleration, reversing, curving, and acceleration. Classification and prediction accuracy were tested against the ground truth, represented by the basographic signal provided by three foot-switches, on samples not used in the learning phase, drawn both from the same group of subjects used to generate the learning set (LS-Test) and from unlearned subjects (US). Results showed an average classification accuracy (± SD) over 23 folds of 94.9 ± 0.3% for LS-Test and 93.4 ± 2.3% for US. Prediction of the foot-floor-contact signal was quantified in terms of the timing of heel strike and toe off: the mean (over ten folds) absolute difference between predictions and foot-switch data for US was 15 ± 17 ms for heel strike and 36 ± 22 ms for toe off. The performance achieved suggests that the proposed method could be successfully used to automatically classify gait phases and predict the foot-floor-contact signal from sEMG signals during ground walking in daily-life conditions.
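The record does not specify the network architecture or features used in the paper. Purely as an illustrative sketch of the general technique named in the abstract (not the authors' model), the snippet below trains a minimal one-hidden-layer perceptron to assign a binary stance/swing label to 8-channel feature vectors; the data here are synthetic stand-ins for per-muscle sEMG envelope values, and all sizes and hyperparameters are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for sEMG features: 8 values per sample (one per muscle),
# label 1 = stance, 0 = swing, generated from a noisy linear rule.
n = 2000
X = rng.normal(size=(n, 8))
w_true = rng.normal(size=8)
y = (X @ w_true + 0.3 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden tanh layer, sigmoid output, trained by full-batch gradient
# descent on the mean binary cross-entropy loss.
h = 16  # hidden units (arbitrary choice for the sketch)
W1 = rng.normal(scale=0.5, size=(8, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=h);      b2 = 0.0
lr = 0.5

for _ in range(2000):
    A = np.tanh(X @ W1 + b1)      # hidden activations, shape (n, h)
    p = sigmoid(A @ W2 + b2)      # P(stance) per sample, shape (n,)
    d = (p - y) / n               # gradient of mean cross-entropy w.r.t. logits
    gW2 = A.T @ d; gb2 = d.sum()
    dA = np.outer(d, W2) * (1 - A**2)   # backprop through tanh
    gW1 = X.T @ dA; gb1 = dA.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5
accuracy = (pred == y.astype(bool)).mean()
```

In practice one would window the rectified, low-pass-filtered sEMG of each muscle and validate on held-out subjects (as the LS-Test/US split in the abstract does); this sketch only shows the shape of the classifier.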