Generating Whole-Arm Avoidance Motion Through Localized Proximity Sensing
Simone Borelli, Francesco Giovinazzo, Alessandro Albini, Francesco Grella, Giorgio Cannata
2025-01-01
Abstract
This article presents a novel control algorithm for robotic manipulators operating in unstructured environments using proximity sensors partially distributed over the platform. The proposed approach exploits arrays of multizone Time-of-Flight (ToF) sensors to generate a sparse point cloud representation of the robot's surroundings. By employing computational geometry techniques, we fuse knowledge of the robot's geometric model with ToF sensory feedback to generate whole-arm motion tasks. This allows the robot to move both sensorized and nonsensorized links in response to unpredictable events such as moving obstacles. In particular, the proposed algorithm computes the pair of closest points between the environment cloud and the robot links, generating a reactive avoidance motion that is implemented as the highest-priority task in a two-level hierarchical architecture. The algorithm allows the robot to move safely in unstructured environments, even without complete sensorization of the whole surface. Experimental validation demonstrates the algorithm's effectiveness in different operating scenarios, achieving performance comparable to that of well-established control techniques that rely on large-area sensing architectures. We validate the approach on a human–robot interaction task involving 20 participants. In addition, we carry out a computational load analysis of the proposed algorithms. Our approach improves the distance margin between the robot and the environment by up to 100 mm, owing to the rendering of virtual avoidance tasks on nonsensorized links.
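The core geometric step described in the abstract, finding the closest pair of points between the environment cloud and the robot links and turning it into a repulsive motion command, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes each link is approximated by a single line segment between consecutive joint frames, and the function names (`closest_pair`, `avoidance_velocity`) and parameters (`d_safe`, `gain`) are hypothetical placeholders for illustration only.

```python
import numpy as np

def closest_point_on_segment(p, a, b):
    """Project point p onto segment [a, b] and return the closest point on it."""
    ab = b - a
    t = np.dot(p - a, ab) / np.dot(ab, ab)
    t = np.clip(t, 0.0, 1.0)  # clamp projection to the segment extent
    return a + t * ab

def closest_pair(cloud, links):
    """Brute-force search for the minimum-distance pair between a sparse
    point cloud (N x 3 array) and a list of (a, b) link segments.
    Returns (cloud point, link point, link index, distance)."""
    best = (None, None, None, np.inf)
    for p in cloud:
        for i, (a, b) in enumerate(links):
            q = closest_point_on_segment(p, a, b)
            d = np.linalg.norm(p - q)
            if d < best[3]:
                best = (p, q, i, d)
    return best

def avoidance_velocity(p_env, p_link, d_safe=0.2, gain=1.0):
    """Repulsive Cartesian velocity at the link point, pushing away from the
    obstacle; active only inside the safety distance d_safe (meters)."""
    diff = p_link - p_env
    d = np.linalg.norm(diff)
    if d >= d_safe or d == 0.0:
        return np.zeros(3)
    return gain * (d_safe - d) / d_safe * (diff / d)
```

In a hierarchical control scheme such as the one the abstract describes, a velocity of this kind would be mapped through the Jacobian at the closest link point and executed as the highest-priority task, with the nominal motion projected into its null space.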



