SAASNets: Shared attention aggregation Siamese networks for building change detection in multispectral remote sensing.

Bibliographic Details
Main Authors: Shuai Pang, Chaochao You, Min Zhang, Baojie Zhang, Liyou Wang, Xiaolong Shi, Yu Sun
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0306755
Description
Summary: Traditional CNN-based methods for building change detection in multispectral remote sensing are susceptible to interference from external factors and constrained by a limited receptive field, so detailed building changes are difficult to capture completely; in addition, redundant information is reused in the encoding stage, which weakens feature representation and detection performance. To address these limitations, we design a shared attention aggregation Siamese network to learn the detailed semantics of buildings in multispectral remote sensing images. On the one hand, a special attention embedding module is introduced into each subspace of the feature extractor to promote interaction among multi-scale local features and enhance the representation of global features. On the other hand, a highly efficient channel and position multi-head attention module is applied to the Siamese features to encode positional details while sharing channel information. Furthermore, a feature aggregation module with a residual strategy fuses features from different stages of the Siamese network, which benefits the detection of buildings of different scales and irregular shapes. Finally, experimental results on the LEVIR-CD and CDD datasets show that the designed SAASNets achieve better accuracy and robustness.
ISSN: 1932-6203
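
The abstract above describes three main ingredients: a weight-sharing (Siamese) encoder, attention applied identically to both temporal feature streams, and residual aggregation of the two streams into a change map. The sketch below is a minimal, hypothetical PyTorch illustration of that overall structure only, assuming a toy CNN backbone and a standard nn.MultiheadAttention in place of the paper's specific attention-embedding and channel/position attention modules; class names such as SharedEncoder and SAASNetSketch are invented for the example and are not taken from the paper or its released code.

```python
# Minimal sketch (an assumption, not the authors' implementation): a Siamese
# encoder with shared weights, a shared multi-head attention applied to both
# feature streams, and a residual fusion head producing a change map.
import torch
import torch.nn as nn


class SharedEncoder(nn.Module):
    """Small CNN backbone; both temporal images pass through the same weights."""
    def __init__(self, in_ch=3, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, dim, 3, stride=2, padding=1),
            nn.BatchNorm2d(dim), nn.ReLU(inplace=True),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1),
            nn.BatchNorm2d(dim), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)  # (B, dim, H/4, W/4)


class SAASNetSketch(nn.Module):
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.encoder = SharedEncoder(dim=dim)                             # shared (Siamese) weights
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)   # attention shared by both streams
        self.fuse = nn.Sequential(                                        # feature aggregation
            nn.Conv2d(2 * dim, dim, 1),
            nn.BatchNorm2d(dim), nn.ReLU(inplace=True),
        )
        self.head = nn.Conv2d(dim, 1, 1)                                  # binary change logits

    def _attend(self, f):
        b, c, h, w = f.shape
        tokens = f.flatten(2).transpose(1, 2)        # (B, H*W, C) position tokens
        out, _ = self.attn(tokens, tokens, tokens)   # self-attention over positions
        return out.transpose(1, 2).reshape(b, c, h, w)

    def forward(self, img_t1, img_t2):
        f1, f2 = self.encoder(img_t1), self.encoder(img_t2)
        a1, a2 = self._attend(f1), self._attend(f2)  # same attention weights for both time steps
        fused = self.fuse(torch.cat([a1, a2], dim=1)) + (a1 - a2)  # residual difference path
        return self.head(fused)                      # (B, 1, H/4, W/4) change logits


if __name__ == "__main__":
    model = SAASNetSketch()
    t1 = torch.randn(2, 3, 128, 128)
    t2 = torch.randn(2, 3, 128, 128)
    print(model(t1, t2).shape)  # torch.Size([2, 1, 32, 32])
```

Both images pass through the same encoder and attention instances, so a single set of weights is updated during training, which is the defining property of a Siamese network; the residual difference path (a1 - a2) stands in loosely for the residual feature-aggregation strategy mentioned in the abstract.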