Non-local spatiotemporal fusion incorporating optical flow estimation and spatial-spectral mitigation-based residual


Bibliographic Details
Main Authors: Yaxu Wang, Xiaobo Luo, Hongwei Ye
Format: Article
Language: English
Published: Taylor & Francis Group 2025-12-01
Series: Geocarto International
Subjects:
Online Access: https://www.tandfonline.com/doi/10.1080/10106049.2025.2532527
Description
Summary: Gaining high spatiotemporal images by fusion has emerged as an essential pre-processing step to integrate fine-spatial-low-temporal resolution (fine-resolution) and coarse-spatial-high-temporal resolution (coarse-resolution) images. However, few weight-based methods account for dynamic change estimation in spatiotemporal weighting, nor do they adequately consider both spatial and spectral sensor differences in residual calculation. To address these issues, this study proposes a non-local weight-based spatiotemporal fusion model, termed FESFM, incorporating optical flow estimation-based spatiotemporal weighting and spatial-spectral difference mitigation-based residual calculation, for fusing multi-sensor vegetation images. In FESFM, an optical flow estimation-based temporal weight is linearly combined with a non-local guided filter for spatiotemporal weighting. For spatial-spectral residual calculation, cubic spline interpolation, accompanied by a scale factor adjustment value, is implemented to mitigate the 'blurred' pixel issue. Furthermore, a spectral difference mitigation-based factor is implemented to reduce spectral distortion. Extensive experiments demonstrate that our method outperforms other weight-based methods visually and quantitatively.
ISSN: 1010-6049; 1752-0762
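The abstract describes a weight-based fusion scheme in which a temporal weight (here derived from optical flow estimation) is linearly combined with a spatial similarity weight, and the weighted coarse-resolution temporal change is added to the fine-resolution base image. The following is a minimal per-pixel sketch of that general idea only; it is not the authors' FESFM implementation, and the function name, the `alpha` combination coefficient, and the neighbour-list interface are all hypothetical.

```python
# Hedged sketch of weight-based spatiotemporal fusion (NOT the authors' FESFM code).
# Idea: predict the fine-resolution value at time t2 as the fine value at t1 plus
# a weighted sum of the coarse-resolution change (t2 - t1) over similar neighbours,
# where each neighbour weight linearly combines a temporal weight (in FESFM,
# optical-flow-based) with a spatial similarity weight (in FESFM, a non-local
# guided filter). `alpha` is an assumed linear-combination coefficient.

def fuse_pixel(fine_t1, coarse_t1, coarse_t2, temporal_w, spatial_w, alpha=0.5):
    """Predict the fine-resolution value at t2 for one target pixel.

    fine_t1, coarse_t1, coarse_t2 : per-neighbour values (index 0 = target pixel)
    temporal_w, spatial_w         : per-neighbour weights in [0, 1]
    alpha                         : hypothetical mixing coefficient
    """
    # Linearly combine the two weight sources, then normalise to sum to 1.
    raw = [alpha * tw + (1 - alpha) * sw for tw, sw in zip(temporal_w, spatial_w)]
    total = sum(raw) or 1.0
    weights = [r / total for r in raw]
    # Weighted coarse-resolution temporal change, added to the fine base value.
    change = sum(w * (c2 - c1)
                 for w, c1, c2 in zip(weights, coarse_t1, coarse_t2))
    return fine_t1[0] + change
```

With equal weights and a uniform coarse change of 0.1, the prediction is simply the fine base value plus 0.1, which matches the intuition that the coarse sensor supplies the temporal change and the fine sensor supplies the spatial detail.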