Deep learning-based object detection for environmental monitoring using big data



Bibliographic Details
Main Authors: Wenbo Lin, Tingting Li, Xiao Li
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-06-01
Series: Frontiers in Environmental Science
Subjects:
Online Access: https://www.frontiersin.org/articles/10.3389/fenvs.2025.1566224/full
Description
Summary:
Introduction: Recent advances in artificial intelligence have transformed the way we analyze complex environmental data. However, high dimensionality, spatiotemporal variability, and heterogeneous data sources continue to pose major challenges.
Methods: In this work, we introduce the Environmental Graph-Aware Neural Network (EGAN), a novel framework designed to model and analyze large-scale, multi-modal environmental datasets. EGAN constructs a spatiotemporal graph representation that integrates physical proximity, ecological similarity, and temporal dynamics, and applies graph convolutional encoders to learn expressive spatial features. These are fused with temporal representations using attention mechanisms, enabling the model to dynamically capture relevant patterns across modalities. The framework is further enhanced by domain-informed learning strategies that incorporate physics-based constraints, meta-learning for regional adaptation, and uncertainty-aware predictions.
Results: Extensive experiments on four benchmark datasets demonstrate that our approach achieves state-of-the-art performance in environmental object detection, segmentation, and scene understanding.
Discussion: EGAN is shown to be a robust and interpretable tool for real-world environmental monitoring applications.
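The abstract names two standard building blocks: a graph convolutional encoder over a spatiotemporal adjacency and attention-based fusion of spatial and temporal features. The article itself provides no code; the following NumPy sketch only illustrates what those two generic operations look like. The function names, the symmetric normalization, and the toy per-node attention scoring are assumptions for illustration, not the authors' EGAN implementation.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetric normalization A_hat = D^{-1/2} (A + I) D^{-1/2},
    the form commonly used in graph convolutional encoders."""
    A = A + np.eye(A.shape[0])           # add self-loops
    d = A.sum(axis=1)                    # node degrees
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return d_inv_sqrt @ A @ d_inv_sqrt

def gcn_layer(A_hat, X, W):
    """One graph convolution: aggregate neighbor features, project, ReLU."""
    return np.maximum(A_hat @ X @ W, 0.0)

def attention_fuse(spatial, temporal):
    """Toy attention fusion: score each modality per node, softmax the
    scores, and take a convex combination of the two feature maps."""
    scores = np.stack([spatial.sum(axis=1), temporal.sum(axis=1)], axis=1)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)  # per-node weights summing to 1
    return w[:, :1] * spatial + w[:, 1:] * temporal
```

In this sketch the adjacency would encode the paper's notions of physical proximity and ecological similarity as edge structure, while `temporal` stands in for features produced by a separate temporal encoder; the real model's attention scoring is unspecified in the abstract.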
ISSN: 2296-665X