FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images
Change detection (CD) in remote sensing (RS) images aims to identify surface changes based on images acquired at different times. However, existing methods are still unsatisfactory at locating fine details of change in RS images because they overlook the inherent temporal information. To address this issue, we introduce a novel Triplet Fusion Temporal Relationship Network (FTRNet). FTRNet incorporates a triplet-input backbone that enables the extraction of both spatial and temporal features. We design a change attention module to enhance bitemporal features, enabling the backbone network to retain temporal information and fuse cross-scale features to extract high-level location information. We evaluate our method on four benchmark datasets: LEVIR-CD, WHU-CD, GZ, and DSIFN. The experimental results show that FTRNet achieves IoU scores of 83.60%, 77.06%, 73.67%, and 77.30% on the LEVIR-CD, WHU-CD, GZ, and DSIFN datasets, respectively, surpassing the second-best baseline by 1.20%, 0.49%, 1.31%, and 1.20%.
| Main Authors: | Wei Wu, Tong Li, Qi Xuan, QiMing Wan, Zuohui Chen |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2024-01-01 |
| Series: | Geocarto International |
| Subjects: | Change detection; remote sensing; triple network; transformer; attention mechanism |
| Online Access: | https://www.tandfonline.com/doi/10.1080/10106049.2024.2353253 |
| _version_ | 1850163830360899584 |
|---|---|
| author | Wei Wu Tong Li Qi Xuan QiMing Wan Zuohui Chen |
| author_facet | Wei Wu Tong Li Qi Xuan QiMing Wan Zuohui Chen |
| author_sort | Wei Wu |
| collection | DOAJ |
| description | Change detection (CD) in remote sensing (RS) images aims to identify surface changes based on images acquired at different times. However, existing methods are still unsatisfactory at locating fine details of change in RS images because they overlook the inherent temporal information. To address this issue, we introduce a novel Triplet Fusion Temporal Relationship Network (FTRNet). FTRNet incorporates a triplet-input backbone that enables the extraction of both spatial and temporal features. We design a change attention module to enhance bitemporal features, enabling the backbone network to retain temporal information and fuse cross-scale features to extract high-level location information. We evaluate our method on four benchmark datasets: LEVIR-CD, WHU-CD, GZ, and DSIFN. The experimental results show that FTRNet achieves IoU scores of 83.60%, 77.06%, 73.67%, and 77.30% on the LEVIR-CD, WHU-CD, GZ, and DSIFN datasets, respectively, surpassing the second-best baseline by 1.20%, 0.49%, 1.31%, and 1.20%. |
| format | Article |
| id | doaj-art-2c16713875dc445491480cc18c1bb480 |
| institution | OA Journals |
| issn | 1010-6049 1752-0762 |
| language | English |
| publishDate | 2024-01-01 |
| publisher | Taylor & Francis Group |
| record_format | Article |
| series | Geocarto International |
| spelling | doaj-art-2c16713875dc445491480cc18c1bb480; 2025-08-20T02:22:09Z; eng; Taylor & Francis Group; Geocarto International; 1010-6049; 1752-0762; 2024-01-01; 39(1); 10.1080/10106049.2024.2353253; FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images; Wei Wu (College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou, China); Tong Li (College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou, China); Qi Xuan (Institute of Cyberspace Security, Zhejiang University of Technology, Hangzhou, China); QiMing Wan (Hikvision Research Institute, Hangzhou Hikvision Digital Technology Co Ltd, Qianmo Road, Hangzhou, Zhejiang, China); Zuohui Chen (Institute of Cyberspace Security, Zhejiang University of Technology, Hangzhou, China); Change detection (CD) in remote sensing (RS) images aims to identify surface changes based on images acquired at different times. However, existing methods are still unsatisfactory at locating fine details of change in RS images because they overlook the inherent temporal information. To address this issue, we introduce a novel Triplet Fusion Temporal Relationship Network (FTRNet). FTRNet incorporates a triplet-input backbone that enables the extraction of both spatial and temporal features. We design a change attention module to enhance bitemporal features, enabling the backbone network to retain temporal information and fuse cross-scale features to extract high-level location information. We evaluate our method on four benchmark datasets: LEVIR-CD, WHU-CD, GZ, and DSIFN. The experimental results show that FTRNet achieves IoU scores of 83.60%, 77.06%, 73.67%, and 77.30% on the LEVIR-CD, WHU-CD, GZ, and DSIFN datasets, respectively, surpassing the second-best baseline by 1.20%, 0.49%, 1.31%, and 1.20%.; https://www.tandfonline.com/doi/10.1080/10106049.2024.2353253; Change detection; remote sensing; triple network; transformer; attention mechanism |
| spellingShingle | Wei Wu Tong Li Qi Xuan QiMing Wan Zuohui Chen FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images Geocarto International Change detection remote sensing triple network transformer attention mechanism |
| title | FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images |
| title_full | FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images |
| title_fullStr | FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images |
| title_full_unstemmed | FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images |
| title_short | FTRNet: triplet fusion temporal relationship network for change detection in bitemporal remote sensing images |
| title_sort | ftrnet triplet fusion temporal relationship network for change detection in bitemporal remote sensing images |
| topic | Change detection remote sensing triple network transformer attention mechanism |
| url | https://www.tandfonline.com/doi/10.1080/10106049.2024.2353253 |
| work_keys_str_mv | AT weiwu ftrnettripletfusiontemporalrelationshipnetworkforchangedetectioninbitemporalremotesensingimages AT tongli ftrnettripletfusiontemporalrelationshipnetworkforchangedetectioninbitemporalremotesensingimages AT qixuan ftrnettripletfusiontemporalrelationshipnetworkforchangedetectioninbitemporalremotesensingimages AT qimingwan ftrnettripletfusiontemporalrelationshipnetworkforchangedetectioninbitemporalremotesensingimages AT zuohuichen ftrnettripletfusiontemporalrelationshipnetworkforchangedetectioninbitemporalremotesensingimages |
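
Note on the reported metric: the abstract quotes IoU (intersection over union) scores for the changed class on each dataset. As a reading aid, the following is a minimal sketch of how such an IoU is conventionally computed for a binary change mask; the function name `change_iou` and the toy masks are illustrative assumptions, not code taken from the paper.

```python
# Minimal sketch (illustrative, not from the paper): IoU of the "changed"
# class between a predicted and a ground-truth binary change mask.
import numpy as np

def change_iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Return intersection-over-union of the changed (value 1) pixels."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    # Convention: if neither mask marks any change, the masks agree perfectly.
    return float(intersection) / float(union) if union > 0 else 1.0

# Toy example: two 4x4 masks that differ in a single pixel.
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
gt = np.array([[0, 1, 1, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])
print(f"IoU = {change_iou(pred, gt):.2%}")  # -> IoU = 75.00%
```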