Rossi, Marco; Tanoni, Giulia; Ilari, Veronica; Sasso, Marco; Principi, Emanuele. A deep learning based method for reference-free full-field strain measurement. In: Engineering Applications of Artificial Intelligence, vol. 172, 2026. ISSN 0952-1976. DOI: 10.1016/j.engappai.2026.114302

A deep learning based method for reference-free full-field strain measurement

Rossi, Marco; Tanoni, Giulia; Ilari, Veronica; Sasso, Marco; Principi, Emanuele
2026-01-01

Abstract

In this work, we address the problem of estimating full-field planar strain when a reference image of the undeformed configuration is not available. Classical Digital Image Correlation (DIC) and related image-based methods rely on the comparison between undeformed and deformed states and therefore cannot be applied in such scenarios. To overcome this limitation, we propose a deep learning-based approach that estimates the in-plane strain state directly from a single deformed image. A Convolutional Neural Network is trained to regress the two principal strains and their orientation by learning deformation-induced texture features, formulating strain estimation as a data-driven regression problem. The method is validated using synthetically deformed images generated from both classical DIC speckle patterns and steel microstructure images acquired by optical microscopy. A systematic sensitivity analysis is performed to assess the influence of subset size and training dataset size. On synthetic data, the proposed approach achieves coefficients of determination exceeding 0.96 and average strain errors on the order of 0.01 m/m. Preliminary validation on real experimental images of deformed patterns demonstrates that the method can capture meaningful strain distributions, although with lower precision than reference-based DIC, as expected from a statistical, reference-free formulation. The proposed approach is therefore not intended to replace classical DIC, but to enable strain estimation in experimental situations where reference configurations are unavailable.
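
To make the regression formulation concrete, the following is a minimal sketch of a CNN that maps a single grayscale deformed-image subset to the two principal strains and their orientation. It is written in PyTorch as an assumption; the record does not specify the framework, and the architecture, the 64-pixel subset size, and the three-component output parameterization are illustrative choices, not the authors' actual network.

# Minimal sketch (assumptions: PyTorch, 64 px subsets, 3 outputs);
# not the authors' actual architecture.
import torch
import torch.nn as nn

class StrainRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional feature extractor over a single-channel subset
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                                      # global pooling
        )
        # Regression head: principal strains eps_1, eps_2 and orientation theta
        self.head = nn.Linear(64, 3)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = StrainRegressor()
subsets = torch.rand(8, 1, 64, 64)      # batch of deformed-image subsets
pred = model(subsets)                   # shape (8, 3): eps_1, eps_2, theta
# Regression loss against known imposed strains (dummy targets here)
loss = nn.functional.mse_loss(pred, torch.zeros_like(pred))

In practice, such a network would be trained on the synthetically deformed subsets described in the abstract, using the known imposed strain states as regression targets.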
Keywords: Deep learning; Digital Image Correlation; Experimental mechanics; Reference-free measurement; Strain measurement
Files in this record:
File: Rossi_Deep-learning-based-method_2026.pdf
Access: open access
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 5.5 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/354492
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0