The development of robots which can safely and effectively interact with people and assist them in structured environments is an open research problem whose importance has been growing rapidly in recent years. Indeed, since these robots operate in environments shared with human beings, they require new ways to achieve human–robot interaction and cooperation. This work presents an approach for performing human–robot interaction by means of robotic manipulators. The interaction is composed of three main steps, namely the selection, the recognition and the grasping of an object. The object selection is based on the execution of a gesture, performed by the user in front of an RGB-D camera and associated with each particular object. The object recognition is achieved by means of the RGB cameras mounted on the two manipulator arms, which send the workspace information to a specific classifier. To realize the grasping step, the object position and orientation are extracted in order to correctly rotate the gripper according to the object on the desk in front of the robot. The final goal is to release the grasped object into the hand of the user standing in front of the desk. This system could support people with limited motor skills who are unable to fetch an object on their own, playing an important role in structured assistive and smart environments, thus promoting human–robot interaction in activities of daily living.
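The three-step pipeline described above can be sketched as follows. This is a minimal, hypothetical illustration of the selection–recognition–grasping flow only: all function names, the gesture-to-object mapping, and the detection data structure are assumptions for clarity, not the authors' implementation.

```python
# Hypothetical sketch of the three-step interaction pipeline
# (selection -> recognition -> grasping). All names and data
# structures are illustrative assumptions.

# Assumed mapping from user gestures (seen by the RGB-D camera)
# to object labels; each gesture is associated with one object.
GESTURE_TO_OBJECT = {"wave": "cup", "point": "bottle"}

def select_object(gesture: str) -> str:
    """Step 1: map the recognized user gesture to an object label."""
    return GESTURE_TO_OBJECT[gesture]

def recognize_object(label: str, detections: dict) -> tuple:
    """Step 2: look up the classifier's output (position, orientation)
    for the selected object in the arm cameras' workspace view."""
    return detections[label]

def plan_grasp(position: tuple, orientation_deg: float) -> dict:
    """Step 3: rotate the gripper to match the object's orientation,
    then move to its position and close the gripper."""
    return {"target": position,
            "gripper_roll_deg": orientation_deg,
            "close": True}

# Usage with a simulated classifier output: object label -> (xyz, angle).
detections = {"cup": ((0.4, 0.1, 0.02), 30.0)}
label = select_object("wave")
pos, ori = recognize_object(label, detections)
command = plan_grasp(pos, ori)
print(command)
```

In a real system each step would be backed by perception and motion-planning components (gesture recognition on the RGB-D stream, an object classifier on the arm cameras, and an inverse-kinematics solver for the gripper pose); the sketch only fixes the data flow between the three stages.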

Development and experimental validation of algorithms for human–robot interaction in simulated and real scenarios / Freddi, A.; Iarlori, S.; Longhi, S.; Monteriù, A.. - In: JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING. - ISSN 1868-5137. - ELECTRONIC. - (2020), pp. 1-11. [10.1007/s12652-019-01676-6]

Development and experimental validation of algorithms for human–robot interaction in simulated and real scenarios

Freddi, A.;Iarlori, S.;Longhi, S.;Monteriù, A.
2020-01-01

Files in this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/272954
Warning! The displayed data have not been validated by the university.

Citations
  • PMC: ND
  • Scopus: 1
  • Web of Science: 1