Assessing Deep Learning Techniques for Remote Gauging and Water Quality Monitoring Using Webcam Images

Bibliographic Details
Main Authors: Ruichen Xu, Binbin Wang
Format: Article
Language: English
Published: MDPI AG 2025-03-01
Series: Hydrology
Subjects:
Online Access: https://www.mdpi.com/2306-5338/12/4/65
Description
Summary: River and stream gauging and water quality monitoring are essential for understanding and managing freshwater resources. The U.S. Geological Survey (USGS) has been implementing and expanding webcam coverage across U.S. stream gauges, and a publicly available website, the Hydrological Imagery Visualization and Information System (HIVIS), has been established. Motivated by routine webcam monitoring and recent advances in image-based machine learning research, this technical paper evaluates three convolutional neural network (CNN) models, a baseline CNN (CNN3) and two deep CNN models (VGG16 and ResNet50), in predicting gauge height, turbidity, dissolved oxygen, and dissolved organic matter in the Missouri River. We selected the Missouri River because of the logistical challenges of collecting field data there. Our objective is to evaluate how well the selected CNN and deep CNN models infer water surface elevation and water quality parameters from webcam images. The results show that the images provide robust predictions of gauge height, reasonable predictions of dissolved oxygen, and unsatisfactory predictions of turbidity and dissolved organic matter, demonstrating both the potential and the limitations of using webcam images to remotely sense water quantity and quality data.
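The paper's approach maps a webcam image to a single continuous value such as gauge height. The actual CNN3, VGG16, and ResNet50 architectures and training details are in the article itself; as a rough illustration of the image-to-scalar regression idea only, the following toy NumPy sketch (all weights random and untrained, every name hypothetical) runs a single convolution layer, a ReLU, global average pooling, and a linear regression head over a stand-in grayscale frame:

```python
import numpy as np

def conv2d(image, kernels):
    """Valid 2-D convolution of an (H, W) image with (K, kh, kw) kernels."""
    K, kh, kw = kernels.shape
    H, W = image.shape
    out = np.empty((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kernels[k])
    return out

def cnn_regress(image, kernels, w, b):
    """Image -> scalar: convolution, ReLU, global average pooling, linear head."""
    feat = np.maximum(conv2d(image, kernels), 0.0)  # (K, H', W') feature maps
    pooled = feat.mean(axis=(1, 2))                 # (K,) pooled features
    return float(pooled @ w + b)                    # scalar prediction

rng = np.random.default_rng(0)
image = rng.random((32, 32))              # stand-in for a grayscale webcam frame
kernels = rng.standard_normal((4, 3, 3))  # 4 random (untrained) 3x3 filters
w = rng.standard_normal(4)                # random regression weights
gauge_height = cnn_regress(image, kernels, w, b=0.5)
print(gauge_height)
```

In the study's setting the filters and regression head would be learned by minimizing the error against co-located USGS sensor readings, and the deep models (VGG16, ResNet50) stack many such convolution layers rather than one.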
ISSN: 2306-5338