Enhancing Autonomous Driving Perception: A Practical Approach to Event-Based Object Detection in CARLA and ROS

Bibliographic Details
Main Authors: Jingxiang Feng, Peiran Zhao, Haoran Zheng, Jessada Konpang, Adisorn Sirikham, Phuri Kalnaowakul
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Vehicles
Online Access: https://www.mdpi.com/2624-8921/7/2/53
Description
Summary: Robust object detection in autonomous driving is challenged by inherent limitations of conventional frame-based cameras, such as motion blur and limited dynamic range. In contrast, event-based cameras, which operate asynchronously and capture rapid changes with high temporal resolution and expansive dynamic range, offer a promising augmentation. While previous research on event-based object detection has predominantly focused on algorithmic enhancements via advanced preprocessing and network optimizations to improve detection accuracy, the practical engineering and integration challenges of deploying these sensors in real-world systems remain underexplored. To address this gap, our study investigates the integration of event-based cameras as a complementary sensor modality in autonomous driving. We adapted a conventional frame-based detection model (YOLOv8) for event-based inputs by training it on the GEN1 dataset, achieving a mean average precision (mAP) of 70.1%, a significant improvement over previous benchmarks. Additionally, we developed a real-time object detection pipeline optimized for event-based data, integrating it into the CARLA simulation environment and ROS for system prototyping. The model was further refined using transfer learning to better adapt to simulation conditions, and the complete pipeline was validated across diverse simulated scenarios to address practical challenges. These results underscore the feasibility of incorporating event cameras into existing perception systems, paving the way for their broader deployment in autonomous vehicle applications.
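Feeding an asynchronous event stream to a frame-based detector such as YOLOv8 requires rendering events into an image-like tensor. The summary does not specify which representation the authors used; the sketch below shows one common choice, a per-polarity event-count histogram, with the `(x, y, t, p)` event layout assumed to follow the GEN1 dataset convention rather than taken from the paper.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate an event stream into a 2-channel polarity histogram.

    `events` is an (N, 4) array of (x, y, t, p) rows with polarity p in
    {0, 1}. This layout mirrors the GEN1 convention but is an assumption
    here, not a detail confirmed by the article.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)
    for x, y, _t, p in events:
        # Count each event at its pixel, in the channel for its polarity.
        frame[int(p), int(y), int(x)] += 1.0
    # Normalize so the detector sees a bounded [0, 1] input range.
    max_val = frame.max()
    if max_val > 0:
        frame /= max_val
    return frame
```

A tensor built this way over a fixed time window can be passed to a standard 2D detector after mapping its two channels to the network's expected input format.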
ISSN:2624-8921