
Marine Obstacles Multi-Modal Detection, Classification and Tracking via Camera-LiDAR Late Fusion

Ponzini, Filippo; Martelli, Michele
2025-01-01

Abstract

The ability to detect an obstacle at sea and automatically classify it is crucial for effective and safe navigation. This work presents a method to enhance situational awareness in Marine Autonomous Surface Ships (MASS) equipped with multi-sensor perception systems through the late fusion of data from RGB cameras and LiDAR sensors. RGB camera data are processed using a pre-trained YOLOv8 object detector. LiDAR data are processed through a clustering-based detection pipeline, and each detection is classified using a trained Random Forest Classifier. Multi-modal detection and classification are used to confirm the presence of a moving obstacle, whose position, obtained from the LiDAR sensor, is tracked over time using a Global Nearest Neighbour tracking algorithm based on a Linear Kalman Filter. The presented sensor-fusion approach enhances object detection and classification, thereby improving situational awareness and decision-making capabilities in autonomous systems. Each step of the pipeline is explained in detail, and the system is validated on real data recorded in a harbor.
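The tracking stage described in the abstract (a Linear Kalman Filter paired with nearest-neighbour data association) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: all names are hypothetical, the filter is simplified to an independent 1-D constant-velocity Kalman filter per axis, and a greedy nearest-neighbour loop stands in for a full Global Nearest Neighbour assignment solver.

```python
import math

class Track:
    """Constant-velocity track: an independent 1-D Kalman filter per axis
    (a simplified stand-in for the full linear KF used in the paper)."""
    def __init__(self, x, y, q=0.1, r=0.5):
        self.pos = [x, y]           # estimated position (e.g. from LiDAR)
        self.vel = [0.0, 0.0]       # estimated velocity
        # per-axis covariance stored as [p_pp, p_pv, p_vv]
        self.p = [[1.0, 0.0, 1.0], [1.0, 0.0, 1.0]]
        self.q, self.r = q, r       # process / measurement noise (tuning guesses)

    def predict(self, dt):
        for i in range(2):
            p_pp, p_pv, p_vv = self.p[i]
            self.pos[i] += self.vel[i] * dt
            # P = F P F^T + Q, with F = [[1, dt], [0, 1]]
            self.p[i] = [p_pp + 2 * dt * p_pv + dt * dt * p_vv + self.q,
                         p_pv + dt * p_vv,
                         p_vv + self.q]

    def update(self, z):
        for i in range(2):
            p_pp, p_pv, p_vv = self.p[i]
            s = p_pp + self.r                 # innovation covariance (H = [1, 0])
            k_p, k_v = p_pp / s, p_pv / s     # Kalman gain
            innov = z[i] - self.pos[i]
            self.pos[i] += k_p * innov
            self.vel[i] += k_v * innov
            self.p[i] = [(1 - k_p) * p_pp, (1 - k_p) * p_pv, p_vv - k_v * p_pv]

def associate(tracks, detections, gate=5.0):
    """Greedy nearest-neighbour assignment within a distance gate
    (a full GNN would solve the assignment globally, e.g. Hungarian)."""
    pairs, used = [], set()
    for ti, t in enumerate(tracks):
        best, best_d = None, gate
        for di, d in enumerate(detections):
            if di in used:
                continue
            dist = math.hypot(d[0] - t.pos[0], d[1] - t.pos[1])
            if dist < best_d:
                best, best_d = di, dist
        if best is not None:
            used.add(best)
            pairs.append((ti, best))
    return pairs
```

A typical cycle is predict, associate, then update each matched track with its LiDAR detection; unmatched detections (e.g. those confirmed by both camera and LiDAR classifiers) would spawn new tracks.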


Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1289816
