A benchmark dataset for class-wise segmentation of construction and demolition waste in cluttered environments
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-05-01 |
| Series: | Scientific Data |
| Online Access: | https://doi.org/10.1038/s41597-025-05243-x |
| Summary: | Efficient management of construction and demolition waste (CDW) is essential for enhancing resource recovery. The lack of publicly available, high-quality datasets for waste recognition limits the development and adoption of automated waste handling solutions. To facilitate data sharing and reuse, this study introduces ‘CDW-Seg’, a benchmark dataset for class-wise segmentation of CDW. The dataset comprises high-resolution images captured at authentic construction sites, featuring skip bins filled with a diverse mixture of CDW materials in the wild. It includes 5,413 manually annotated objects across ten categories: concrete, fill dirt, timber, hard plastic, soft plastic, steel, fabric, cardboard, plasterboard, and the skip bin, representing a total of 2,492,021,189 pixels. Each object was meticulously annotated through semantic segmentation, providing reliable ground-truth labels. To demonstrate the applicability of the dataset, an adapter-based fine-tuning approach was implemented using a hierarchical Vision Transformer, ensuring computational efficiency suitable for deployment in automated waste handling scenarios. CDW-Seg has been made publicly accessible to promote data sharing, facilitate further research, and support the development of automated solutions for resource recovery. |
| ISSN: | 2052-4463 |
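
The abstract mentions adapter-based fine-tuning of a hierarchical Vision Transformer for class-wise segmentation. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch: a backbone block is frozen and only a small bottleneck adapter is trained, using the ten CDW-Seg category names listed in the abstract. The `Adapter` and `AdaptedBlock` modules, the dimensions, and the use of `nn.TransformerEncoderLayer` as a stand-in block are all assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of adapter-based fine-tuning for class-wise segmentation.
# Class names follow the ten CDW-Seg categories from the abstract; all module
# and parameter names here are illustrative assumptions.
import torch
import torch.nn as nn

CDW_CLASSES = [
    "concrete", "fill_dirt", "timber", "hard_plastic", "soft_plastic",
    "steel", "fabric", "cardboard", "plasterboard", "skip_bin",
]

class Adapter(nn.Module):
    """Bottleneck adapter appended to a frozen transformer block."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)
        nn.init.zeros_(self.up.weight)   # zero init: adapter starts as identity
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

class AdaptedBlock(nn.Module):
    """Wraps a backbone block, freezes it, and trains only the adapter."""
    def __init__(self, block: nn.Module, dim: int):
        super().__init__()
        self.block = block
        self.adapter = Adapter(dim)
        for p in self.block.parameters():
            p.requires_grad = False      # backbone weights stay fixed

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.block(x))

if __name__ == "__main__":
    dim = 96                              # e.g. stage-1 width of a hierarchical ViT
    backbone_block = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
    block = AdaptedBlock(backbone_block, dim)
    tokens = torch.randn(2, 196, dim)     # (batch, tokens, channels)
    out = block(tokens)
    trainable = sum(p.numel() for p in block.parameters() if p.requires_grad)
    print(out.shape, "trainable params:", trainable, "classes:", len(CDW_CLASSES))
```

Zero-initializing the adapter's up-projection makes the wrapped block behave exactly like the frozen backbone at the start of fine-tuning, which is one common way such adapters keep early training stable while adding only a small number of trainable parameters.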