Fisheye omnidirectional stereo depth estimation assisted with edge-awareness

Bibliographic Details
Main Authors: Junren Sun, Hao Xue, Shibo Guo, Xunqi Zheng
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-05-01
Series: Frontiers in Physics
Subjects: fisheye camera; omnidirectional; depth estimation; edge information; RCNN; self-attention
Online Access: https://www.frontiersin.org/articles/10.3389/fphy.2025.1555785/full
Collection: DOAJ
Description: The wide field of view of fisheye cameras introduces significant image distortion, making accurate depth estimation more challenging than with pinhole camera models. This paper proposes an edge-aware panoramic depth estimation network for fisheye cameras, aimed at improving depth estimation accuracy for fisheye images. We design an Edge-Aware Module (EAM) that dynamically weights the features extracted by a residual convolutional neural network (residual CNN) using the extracted edge information. A spherical alignment method then maps the image features from the different cameras into a unified spherical coordinate system. A cost volume is built over a set of depth hypotheses and regularized with a 3D convolutional network. To address the discretization of depth values, we employ a hybrid classification-and-regression strategy: the classification branch predicts a probability distribution over depth categories, while the regression branch computes the final depth values by probability-weighted linear interpolation over these categories. Experimental results show that our method outperforms existing approaches in depth estimation accuracy and object structure representation on the OmniThings, OmniHouse, and Urban (sunny) datasets. Our method therefore provides a more accurate depth estimation solution for fisheye cameras, effectively handling the strong distortion inherent in fisheye images while improving both depth accuracy and detail preservation.
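The hybrid classification-and-regression step in the description above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name `soft_depth`, the `(D, H, W)` cost-volume layout, and the softmax-over-cost formulation of the classification branch are all assumptions for the sketch.

```python
import numpy as np

def soft_depth(cost_volume, depth_hypotheses):
    """Hybrid classification-regression over depth hypotheses (illustrative sketch).

    cost_volume      : (D, H, W) matching scores for D depth hypotheses per pixel
    depth_hypotheses : (D,) depth value assigned to each hypothesis/category

    Classification branch: softmax over the D bins gives a per-pixel
    probability distribution over depth categories.
    Regression branch: the final depth is the probability-weighted
    combination of the hypothesis depths, yielding sub-bin precision.
    """
    # Numerically stable softmax along the depth-hypothesis axis.
    shifted = cost_volume - cost_volume.max(axis=0, keepdims=True)
    exp = np.exp(shifted)
    prob = exp / exp.sum(axis=0, keepdims=True)        # (D, H, W) distribution

    # Expected depth: weighted linear interpolation between the discrete bins.
    return (prob * depth_hypotheses[:, None, None]).sum(axis=0)  # (H, W)
```

When the distribution is sharply peaked at one bin, the result approaches that bin's depth; a flat distribution yields the mean of the hypotheses. This is why the scheme avoids the quantization artifacts of picking a single discrete depth category.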
Record ID: doaj-art-68f0e0e77a3747bcbc8847440cbf8cfa
ISSN: 2296-424X