DVF-NET: Bi-Temporal Remote Sensing Image Registration Network Based on Displacement Vector Field Fusion
| Main Authors: | , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-02-01 |
| Series: | Sensors |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1424-8220/25/5/1380 |
| Summary: | Accurate image registration is essential for various remote sensing applications, particularly in multi-temporal image analysis. This paper introduces DVF-NET, a novel deep learning-based framework for dual-temporal remote sensing image registration. DVF-NET integrates two displacement vector fields to address nonlinear distortions caused by significant variations between images, enabling more precise image alignment. A key innovation of this method is the incorporation of a Structural Attention Module (SAT), which enhances the model's ability to focus on structural features, improving the feature extraction process. Additionally, we propose a novel loss function design that combines multiple similarity metrics, ensuring more comprehensive supervision during training. Experimental results on various remote sensing datasets indicate that the proposed DVF-NET outperforms existing methods in both accuracy and robustness, particularly when handling images with substantial geometric distortions such as tilted buildings. The results validate the effectiveness of our approach and highlight its potential for various remote sensing tasks, including change detection, land cover classification, and environmental monitoring. DVF-NET provides a promising direction for the advancement of remote sensing image registration techniques, offering both high precision and robustness in complex real-world scenarios. |
| ISSN: | 1424-8220 |
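The abstract's two core ideas — fusing a pair of displacement vector fields to warp one image onto the other, and supervising with a loss that combines several similarity metrics — can be illustrated with a minimal NumPy sketch. Note this is a hand-rolled illustration, not the paper's actual architecture: the convex fusion rule, the bilinear warp, and the MSE-plus-NCC loss weights are all assumptions made for the example.

```python
import numpy as np

def fuse_dvfs(dvf_a, dvf_b, alpha=0.5):
    """Fuse two displacement vector fields of shape (H, W, 2).
    A simple convex combination stands in for the paper's (unspecified
    here) fusion module."""
    return alpha * dvf_a + (1.0 - alpha) * dvf_b

def warp_image(img, dvf):
    """Warp a 2-D image with a displacement field via bilinear sampling.
    dvf[..., 0] holds per-pixel row offsets, dvf[..., 1] column offsets."""
    H, W = img.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    sy = np.clip(ys + dvf[..., 0], 0, H - 1)   # sampling coordinates
    sx = np.clip(xs + dvf[..., 1], 0, W - 1)
    y0 = np.floor(sy).astype(int)
    x0 = np.floor(sx).astype(int)
    y1 = np.clip(y0 + 1, 0, H - 1)
    x1 = np.clip(x0 + 1, 0, W - 1)
    wy, wx = sy - y0, sx - x0                  # interpolation weights
    top = img[y0, x0] * (1 - wx) + img[y0, x1] * wx
    bot = img[y1, x0] * (1 - wx) + img[y1, x1] * wx
    return top * (1 - wy) + bot * wy

def combined_loss(fixed, moved, w_mse=1.0, w_ncc=1.0):
    """Combine two similarity metrics (MSE and normalized cross-correlation)
    into one training signal; the specific metrics and weights are
    illustrative, not taken from the paper."""
    mse = np.mean((fixed - moved) ** 2)
    f = fixed - fixed.mean()
    m = moved - moved.mean()
    ncc = (f * m).sum() / (np.sqrt((f ** 2).sum() * (m ** 2).sum()) + 1e-8)
    return w_mse * mse + w_ncc * (1.0 - ncc)   # perfect alignment -> 0
```

With a zero displacement field the warp is the identity and the combined loss vanishes, which is a quick sanity check that the sampling and loss terms are wired up consistently.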