Towards Anytime Optical Flow Estimation with Event Cameras


Bibliographic Details
Main Authors: Yaozu Ye, Hao Shi, Kailun Yang, Ze Wang, Xiaoting Yin, Lei Sun, Yaonan Wang, Kaiwei Wang
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Sensors
Subjects:
Online Access: https://www.mdpi.com/1424-8220/25/10/3158
Description
Summary: Event cameras respond to changes in log-brightness at the millisecond level, making them ideal for optical flow estimation. However, existing event-camera datasets provide only low-frame-rate ground truth for optical flow, limiting the research potential of event-driven optical flow. To address this challenge, we introduce a low-latency event representation, the <i>unified voxel grid (UVG)</i>, and propose <i>EVA-Flow</i>, an <i>EV</i>ent-based <i>A</i>nytime <i>Flow</i> estimation network that produces high-frame-rate event optical flow using only low-frame-rate optical flow ground truth for supervision. Furthermore, we propose the <i>rectified flow warp loss (RFWL)</i> for the unsupervised assessment of intermediate optical flow. Extensive experiments on MVSEC, DSEC, and our EVA-FlowSet demonstrate that EVA-Flow achieves competitive performance, super-low latency (5 ms), time-dense motion estimation (200 Hz), and strong generalization.
ISSN: 1424-8220