Parcel Segmentation Method Combined YOLOV5s and Segment Anything Model Using Remote Sensing Image
| | |
|---|---|
| Main Authors: | , , , , , , |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-07-01 |
| Series: | Land |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2073-445X/14/7/1429 |
| Summary: | Accurate land parcel segmentation in remote sensing imagery is critical for applications such as land use analysis, agricultural monitoring, and urban planning. However, existing methods often underperform in complex scenes due to small-object segmentation challenges, blurred boundaries, and background interference, all compounded by variations in sensor resolution and atmospheric conditions. To address these limitations, we propose a dual-stage framework that combines an enhanced YOLOv5s detector with the Segment Anything Model (SAM) to improve segmentation accuracy and robustness. The improved YOLOv5s module integrates Efficient Channel Attention (ECA) and a BiFPN neck to strengthen feature extraction and small-object recognition, while Soft-NMS reduces missed detections. The SAM module receives bounding-box prompts from YOLOv5s and incorporates morphological refinement and mask stability scoring for improved boundary continuity and mask quality. A composite Focal-Dice loss mitigates class imbalance. In addition to the publicly available CCF BDCI dataset, we constructed a new WuJiang dataset to evaluate cross-domain performance. Experimental results demonstrate that our method achieves an IoU of 89.8% and a precision of 90.2%, outperforming baseline models and generalizing well across diverse remote sensing conditions. |
| ISSN: | 2073-445X |
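The summary notes that Soft-NMS is used in the detection stage to reduce missed detections among adjacent parcels. As an illustration of how score decay (rather than hard suppression) preserves overlapping candidates, here is a minimal NumPy sketch of Gaussian Soft-NMS; the function names, the decay parameter `sigma`, and the score threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def box_iou(a, b):
    """IoU of two boxes given as [x1, y1, x2, y2]."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: rather than discarding boxes that overlap the
    current best detection, decay their scores by exp(-IoU^2 / sigma).
    Returns the indices kept (in selection order) and the decayed scores."""
    boxes = boxes.astype(float)
    scores = scores.astype(float).copy()
    keep, remaining = [], list(range(len(boxes)))
    while remaining:
        # Select the highest-scoring remaining box.
        best = max(remaining, key=lambda i: scores[i])
        keep.append(best)
        remaining.remove(best)
        # Soften (not zero out) the scores of boxes overlapping the winner.
        for i in remaining:
            iou = box_iou(boxes[best], boxes[i])
            scores[i] *= np.exp(-(iou ** 2) / sigma)
        # Drop only boxes whose score has decayed below the threshold.
        remaining = [i for i in remaining if scores[i] >= score_thresh]
    return keep, scores
```

With two heavily overlapping parcel candidates and one disjoint one, classical hard NMS would delete the second candidate outright; Soft-NMS keeps it with a reduced score, which is the behavior the summary credits with fewer missed detections.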