An Edge-Computing-Driven Approach for Augmented Detection of Construction Materials: An Example of Scaffold Component Counting

Bibliographic Details
Main Authors: Xianzhong Zhao, Bo Cheng, Yujie Lu, Zhaoqi Huang
Format: Article
Language: English
Published: MDPI AG 2025-04-01
Series: Buildings
Online Access: https://www.mdpi.com/2075-5309/15/7/1190
Description
Summary: Construction material management is crucial for project progression. Counting massive quantities of scaffold components is a key step in efficient material management. However, traditional counting methods are time-consuming and laborious. A vision-based method running on edge devices therefore offers a promising solution for counting these materials. This study proposed an edge-computing-driven approach for detecting and counting scaffold components. Two algorithmic refinements to YOLOX, generalized intersection over union (GIoU) and soft non-maximum suppression (Soft-NMS), were introduced to improve detection accuracy under occlusion. An automated pruning method was proposed to compress the model, achieving a 60.2% reduction in computation and a 9.1% increase in inference speed. Two practical case studies demonstrated that the method, when deployed on edge devices, achieved 98.9% counting accuracy and reduced the time required for counting tasks by 87.9% compared with the conventional method. This research provides an edge-computing-driven framework for counting large quantities of materials, establishing a comprehensive workflow for intelligent applications in construction management. The paper concludes with limitations of the current study and suggestions for future work.
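Note (illustrative, not from the article): GIoU and Soft-NMS are standard, published detection refinements, and the Python sketch below shows how they are commonly computed for axis-aligned boxes. The function names, the Gaussian sigma, and the score threshold are assumptions chosen for illustration; they are not the authors' implementation or parameter settings.

# Sketch of the two refinements named in the abstract: GIoU between two
# boxes and a Gaussian Soft-NMS score decay for overlapping detections.
import numpy as np

def giou(box_a, box_b):
    """Generalized IoU for boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection area
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    # Union area
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    iou = inter / union if union > 0 else 0.0
    # Smallest enclosing box; GIoU penalizes its empty space
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c_area = cw * ch
    if c_area <= 0:
        return iou
    return iou - (c_area - union) / c_area

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay overlapping scores instead of discarding boxes."""
    boxes, scores = list(boxes), list(scores)
    keep = []
    while boxes:
        m = int(np.argmax(scores))
        best_box, best_score = boxes.pop(m), scores.pop(m)
        keep.append((best_box, best_score))
        # Decay remaining scores by their overlap with the selected box
        for i, b in enumerate(boxes):
            ov = max(giou(best_box, b), 0.0)  # plain IoU is also common here
            scores[i] *= np.exp(-(ov ** 2) / sigma)
        # Drop boxes whose decayed score falls below the threshold
        boxes = [b for b, s in zip(boxes, scores) if s >= score_thresh]
        scores = [s for s in scores if s >= score_thresh]
    return keep

Keeping overlapping boxes with decayed scores, rather than suppressing them outright, is what makes Soft-NMS useful when densely stacked scaffold components occlude one another.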
ISSN: 2075-5309