OCAFI: observation-centric appearance-free interpolation technique for multi-object tracking

Bibliographic Details
Main Authors: Shavantrevva Bilakeri, Karunakar A. Kotegar
Format: Article
Language: English
Published: Taylor & Francis Group 2025-12-01
Series: Cogent Engineering
Online Access: https://www.tandfonline.com/doi/10.1080/23311916.2025.2532808
Description
Summary: Multi-object tracking involves detecting and linking individuals over time while assigning them unique IDs. However, traditional motion models often assume linear trajectories and require continuous observations, making them vulnerable to occlusions and non-linear motion. Tracking individuals with similar appearances also remains a challenge. The OC-SORT tracker recently introduced an improved Kalman Filter (KF)-based motion model that enhances performance but increases fragmentations and ID switches, which degrade track quality and IDF1 scores. To address these issues, we re-evaluate and enhance OC-SORT across multiple components, including detection, data association, and post-processing. We incorporate an appearance-free linking model to merge short tracklets into complete trajectories, and we apply Gaussian-smoothed interpolation during post-processing to recover missing detections and refine localization. The resulting tracker, OCAFI (Observation-Centric Appearance-Free Interpolated tracker), combines motion and spatiotemporal cues, an improved KF, and robust post-processing. These enhancements significantly improve HOTA, IDF1, and overall track quality on benchmark MOT datasets. OCAFI achieves a MOTA of 80.5% on MOT17, 77.8% on MOT20, and 92.1% on DanceTrack, demonstrating its accuracy and robustness in challenging multi-object tracking scenarios.
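Note (illustrative): the abstract does not specify the internals of the appearance-free linking model. The Python sketch below shows one plausible scheme under the stated assumption that linking uses only spatiotemporal cues: each tracklet's endpoint is extrapolated with a constant-velocity model, candidate pairs are gated by time gap and distance, and the Hungarian algorithm picks a one-to-one matching. The thresholds `max_gap` and `max_dist` are hypothetical, not taken from OCAFI.

```python
# A heuristic sketch of appearance-free tracklet linking (not the paper's
# actual model): constant-velocity extrapolation + Hungarian assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_tracklets(tracklets, max_gap=30, max_dist=75.0):
    """Merge short tracklets into longer trajectories.

    tracklets: list of (N_i, 3) arrays of [frame, cx, cy] rows, frames ascending.
    Returns a list of merged (M_j, 3) arrays.
    """
    n = len(tracklets)
    INF = 1e6
    cost = np.full((n, n), INF)
    for i, a in enumerate(tracklets):
        # Constant-velocity estimate from the tail of tracklet a.
        v = a[-1, 1:] - a[-2, 1:] if len(a) > 1 else np.zeros(2)
        for j, b in enumerate(tracklets):
            gap = b[0, 0] - a[-1, 0]
            if i == j or gap <= 0 or gap > max_gap:
                continue                      # only forward, nearby-in-time pairs
            pred = a[-1, 1:] + v * gap        # extrapolate a's endpoint to b's start
            d = np.linalg.norm(pred - b[0, 1:])
            if d < max_dist:
                cost[i, j] = d
    rows, cols = linear_sum_assignment(cost)  # one-to-one endpoint matching
    succ = {i: j for i, j in zip(rows, cols) if cost[i, j] < INF}
    heads = set(range(n)) - set(succ.values())
    merged = []
    for h in heads:                           # follow each chain of links
        chain, cur = [tracklets[h]], h
        while cur in succ:
            cur = succ[cur]
            chain.append(tracklets[cur])
        merged.append(np.vstack(chain))
    return merged
```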
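Note (illustrative): the Gaussian-smoothed interpolation step can be approximated by fitting a Gaussian process regression over frame indices and predicting every frame in a tracklet's range, which both fills occlusion gaps and denoises the observed boxes. The sketch below follows that generic recipe; the kernel length-scale `tau=10` and the per-coordinate fitting are assumptions, not parameters reported by the paper.

```python
# A minimal sketch of Gaussian-smoothed interpolation for a single tracklet,
# using scikit-learn's GP regression with an RBF kernel. tau is a guess.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def gaussian_smoothed_interpolation(tracklet, tau=10.0):
    """Fill frame gaps and denoise box coordinates with GP regression.

    tracklet: (N, 5) array of [frame, x, y, w, h] rows, frames ascending.
    Returns an (M, 5) array covering every frame in the observed range.
    """
    frames = tracklet[:, 0:1]                 # GP inputs: frame indices
    full = np.arange(frames.min(), frames.max() + 1).reshape(-1, 1)
    out = np.zeros((len(full), 5))
    out[:, 0] = full[:, 0]
    for c in range(1, 5):                     # smooth x, y, w, h independently
        gpr = GaussianProcessRegressor(kernel=RBF(length_scale=tau),
                                       normalize_y=True)
        gpr.fit(frames, tracklet[:, c])
        out[:, c] = gpr.predict(full)
    return out

# Usage: a tracklet with a 3-frame occlusion gap between frames 4 and 8.
track = np.array([[1, 100, 50, 30, 60],
                  [2, 104, 51, 30, 60],
                  [3, 109, 52, 31, 61],
                  [4, 113, 52, 31, 61],
                  [8, 130, 55, 32, 62]], dtype=float)
smoothed = gaussian_smoothed_interpolation(track)
print(smoothed.shape)  # (8, 5): frames 1..8, gap recovered
```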
ISSN: 2331-1916