Advancing Seabed Bedform Mapping in the Kuźnica Deep: Leveraging Multibeam Echosounders and Machine Learning for Enhanced Underwater Landscape Analysis

Bibliographic Details
Main Author: Łukasz Janowski
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/17/3/373
Description
Summary: The ocean, covering 71% of Earth’s surface, remains largely unexplored due to the challenges of the marine environment. This study focuses on the Kuźnica Deep in the Baltic Sea, aiming to develop an automatic seabed mapping methodology using multibeam echosounders (MBESs) and machine learning. The research integrates various scientific fields to enhance understanding of the Kuźnica Deep’s underwater landscape, addressing sediment composition, backscatter intensity, and geomorphometric features. Advances in remote sensing, particularly object-based image analysis (OBIA) and machine learning, have significantly improved geospatial data analysis for underwater landscapes. The study highlights the importance of training models on a reduced set of relevant features, as identified by the Boruta algorithm, to improve accuracy and robustness. Key geomorphometric features were crucial for seafloor composition mapping, while textural features were less significant. Models with fewer, carefully selected features performed better, reducing overfitting and computational complexity. The findings support hydrographic, ecological, and geological research by providing reliable seabed composition maps and enhancing decision-making and hypothesis generation.
ISSN:2072-4292