Novelty Detection in Autonomous Driving: A Generative Multi-Modal Sensor Fusion Approach

Bibliographic Details
Main Authors: Hafsa Iqbal, Haleema Sadia, Abdulla Al-Kaff, Fernando García
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Open Journal of Intelligent Transportation Systems
Online Access: https://ieeexplore.ieee.org/document/11037519/
Description
Summary: This paper presents a bio-inspired, generative Multi-Modal Sensor Fusion (MSF) framework for effectively detecting novel and dynamic situations in the surroundings of an Autonomous Vehicle (AV). The MSF framework fuses both proprioceptive (wheel odometry) and exteroceptive (LiDAR point-cloud) sensory inputs. A novel 3-Dimensional Dynamic Variational Auto-Encoder (3D-DVAE) model is employed to learn attention-focused distributions from point clouds in an unsupervised manner. By fusing the distributions of both modalities (wheel odometry and LiDAR), modality-specific expert distributions are learned, capturing both proprioceptive and exteroceptive information about the surroundings. Bayesian Filtering is then applied to detect novel situations and dynamics by probabilistically inferring future states. The proposed method is validated on the KITTI dataset across diverse and complex urban environments. Both quantitative and qualitative results demonstrate the effectiveness of the proposed approach in detecting novelties through multi-modal fusion.
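The summary describes two key steps: fusing per-modality Gaussian distributions into a joint expert distribution, and scoring novelty by comparing observations against probabilistically predicted states. A minimal sketch of how such a pipeline could work is below; it is not the authors' implementation, and the function names, the precision-weighted product-of-experts fusion, and the negative-log-likelihood novelty score are all illustrative assumptions standing in for the paper's 3D-DVAE latents and Bayesian filter.

```python
import numpy as np

def fuse_experts(mu_a, var_a, mu_b, var_b):
    # Illustrative fusion of two Gaussian "experts" (e.g., wheel-odometry
    # and LiDAR latent distributions) via a precision-weighted product:
    # the fused precision is the sum of the individual precisions.
    prec = 1.0 / var_a + 1.0 / var_b
    var = 1.0 / prec
    mu = var * (mu_a / var_a + mu_b / var_b)
    return mu, var

def novelty_score(obs, pred_mu, pred_var):
    # Negative log-likelihood of the current observation under the
    # predicted (fused) state distribution; a large value indicates a
    # poorly predicted, i.e. potentially novel, situation.
    return 0.5 * np.sum(np.log(2.0 * np.pi * pred_var)
                        + (obs - pred_mu) ** 2 / pred_var)

# Hypothetical 1-D latent example: two equally confident experts.
mu, var = fuse_experts(0.0, 1.0, 2.0, 1.0)   # fused mean 1.0, variance 0.5
score_expected = novelty_score(np.array([1.0]), mu, var)
score_novel = novelty_score(np.array([5.0]), mu, var)
```

In this toy setting an observation far from the fused prediction (`5.0`) yields a much higher score than one near it (`1.0`), which is the thresholding signal a novelty detector would act on.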
ISSN:2687-7813