
Machine learning to probe modal interaction in dynamic atomic force microscopy / Belardinelli, P.; Chandrashekar, A.; Wiebe, R.; Alijani, F.; Lenci, S. - In: MECHANICAL SYSTEMS AND SIGNAL PROCESSING. - ISSN 0888-3270. - 179:(2022). [10.1016/j.ymssp.2022.109312]

Machine learning to probe modal interaction in dynamic atomic force microscopy

Belardinelli P.; Lenci S.
2022-01-01

Abstract

Modal interactions are pervasive effects that commonly emerge in nanomechanical systems. The coupling of vibrating modes can be leveraged in many ways, including to enhance sensing or to reveal complex phenomena. In this work we show how machine learning and data-driven approaches can be used to capture intermodal coupling. We employ a quasi-recurrent neural network (QRNN) to identify mode coupling and verify its applicability on experimental data obtained from tapping-mode atomic force microscopy (AFM). The hidden units of the QRNN are monitored to trace fingerprints of mode activation and to quantify their contributions to the global distortion of orbits in phase space. To demonstrate the broad applicability of the method, the trained model is re-applied to different experiments and to diverse materials. Over a range of tip-sample configurations, dynamic AFM possesses features general enough to be captured by the QRNN, so no ad-hoc re-training is required to identify the interacting modes. Our study opens up a route for utilizing established machine learning techniques for rapid recognition of complex nonlinear effects, such as internal resonances, in nanotechnology. The QRNN analysis is meant to assist AFM sensing operations that exploit modal interaction to enhance the signal-to-noise ratio of higher harmonics and to provide high-resolution compositional contrast in multi-frequency AFM applications.
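For readers unfamiliar with the architecture named in the abstract: a QRNN layer combines a causal convolution over the time axis with a lightweight recurrent "f-pooling" step (Bradbury et al., 2016). The sketch below is a minimal, self-contained NumPy illustration of such a layer applied to a toy two-tone signal standing in for a cantilever deflection trace; it is not the authors' trained model, and all weights, sizes, and the example signal are hypothetical.

```python
import numpy as np

def qrnn_layer(x, w_z, w_f, width=2):
    """One QRNN layer with f-pooling.

    x        : (T, d_in) input sequence (here, a sampled deflection signal)
    w_z, w_f : (width * d_in, d_h) convolution weights for the candidate
               activations and forget gates (biases omitted for brevity)
    """
    T, d_in = x.shape
    # Causal padding so each timestep sees only past samples
    xp = np.vstack([np.zeros((width - 1, d_in)), x])
    # Convolution over time, written as a sliding window + matmul
    windows = np.stack([xp[t:t + width].ravel() for t in range(T)])
    z = np.tanh(windows @ w_z)                   # candidate activations
    f = 1.0 / (1.0 + np.exp(-(windows @ w_f)))   # forget gates
    # Recurrent f-pooling: h_t = f_t * h_{t-1} + (1 - f_t) * z_t
    h = np.zeros_like(z)
    h_prev = np.zeros(z.shape[1])
    for t in range(T):
        h_prev = f[t] * h_prev + (1.0 - f[t]) * z[t]
        h[t] = h_prev
    return h  # hidden-unit traces, one per channel, available for monitoring

# Toy usage with random (untrained) weights and a synthetic two-tone signal
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
signal = (np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 25 * t))[:, None]
h = qrnn_layer(signal, rng.normal(size=(2, 8)), rng.normal(size=(2, 8)))
print(h.shape)  # (200, 8): 200 timesteps, 8 hidden channels
```

Because the recurrence involves only elementwise gating (the heavy matrix work sits in the convolution), the hidden channels can be inspected individually over time, which is the property the abstract leverages when tracing mode-activation fingerprints.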
Files in this item:

23 Machine learning to probe modal interaction in dynamic atomic force microscopy - preprint.pdf

Open access

Type: Pre-print document (manuscript submitted to the publisher, prior to peer review)
License: Creative Commons
Size: 5.17 MB
Format: Adobe PDF
View/Open

1-s2.0-S0888327022004502-main.pdf

Archive administrators only

Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 4.21 MB
Format: Adobe PDF
View/Open · Request a copy

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/305780
Citations
  • PMC: n/a
  • Scopus: 5
  • Web of Science: 3