Research on a Multi-Type Barcode Defect Detection Model Based on Machine Vision
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-07-01 |
| Series: | Applied Sciences |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2076-3417/15/15/8176 |
| Summary: | Barcodes are ubiquitous in manufacturing and logistics, but defects can reduce decoding efficiency and disrupt the supply chain. Existing studies primarily focus on a single barcode type or rely on small-scale datasets, limiting generalizability. We propose Y8-LiBARNet, a lightweight two-stage framework for multi-type barcode defect detection. In stage 1, a YOLOv8n backbone localizes 1D and 2D barcodes in real time. In stage 2, a dual-branch network integrating ResNet50 and ViT-B/16 via hierarchical attention performs three-class classification on cropped regions of interest (ROIs): intact, defective, and non-barcode. Experiments on the public BarBeR dataset, which covers planar and non-planar surfaces, varying illumination, and sensor noise, show that Y8-LiBARNet achieves a detection-stage mAP@0.5 of 0.984 (1D: 0.992; 2D: 0.977) with a peak F1 score of 0.970. Subsequent defect classification attains 0.925 accuracy, 0.925 recall, and a 0.919 F1 score. Compared with single-branch baselines, our framework improves overall accuracy by 1.8–3.4% and raises defective-barcode recall by 8.9%. A Cohen's kappa of 0.920 indicates strong label consistency and model robustness. These results demonstrate that Y8-LiBARNet delivers high-precision, real-time performance, providing a practical solution for industrial barcode quality inspection. |
| ISSN: | 2076-3417 |
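
For readers who want a concrete picture of the two-stage pipeline summarized above, the sketch below shows one plausible way to wire the stages together in Python, assuming PyTorch/torchvision and the Ultralytics YOLO package. The weight files (e.g., `yolov8n.pt`), class ordering, function names, and the simple concatenation-based fusion layer are illustrative assumptions; the paper's actual hierarchical attention fusion and trained barcode checkpoints are not reproduced here.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract:
# Stage 1: YOLOv8n localizes 1D/2D barcodes; Stage 2: a dual-branch
# ResNet50 + ViT-B/16 classifier labels each cropped ROI as intact,
# defective, or non-barcode. All names and weights here are placeholders.
import torch
import torch.nn as nn
from torchvision import models, transforms
from ultralytics import YOLO
from PIL import Image

CLASSES = ["intact", "defective", "non-barcode"]  # assumed class order


class DualBranchClassifier(nn.Module):
    """Simplified CNN + ViT dual-branch head; the paper's hierarchical
    attention fusion is approximated by a plain learned fusion layer."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.cnn = models.resnet50(weights="IMAGENET1K_V2")
        self.cnn.fc = nn.Identity()                      # 2048-d CNN features
        self.vit = models.vit_b_16(weights="IMAGENET1K_V1")
        self.vit.heads = nn.Identity()                   # 768-d ViT features
        self.fuse = nn.Sequential(
            nn.Linear(2048 + 768, 512), nn.ReLU(), nn.Linear(512, num_classes)
        )

    def forward(self, x):
        feats = torch.cat([self.cnn(x), self.vit(x)], dim=1)
        return self.fuse(feats)


# ViT-B/16 expects 224x224 inputs; normalize with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])


def inspect(image_path: str, detector: YOLO, classifier: DualBranchClassifier):
    """Stage 1: detect barcode boxes; Stage 2: classify each cropped ROI."""
    image = Image.open(image_path).convert("RGB")
    results = detector(image_path)[0]                    # YOLOv8 inference
    labels = []
    for box in results.boxes.xyxy.tolist():              # (x1, y1, x2, y2)
        roi = image.crop(tuple(int(v) for v in box))
        with torch.no_grad():
            logits = classifier(preprocess(roi).unsqueeze(0))
        labels.append(CLASSES[int(logits.argmax(dim=1).item())])
    return labels


if __name__ == "__main__":
    detector = YOLO("yolov8n.pt")        # placeholder COCO weights, not barcode-trained
    classifier = DualBranchClassifier().eval()
    print(inspect("sample_label.jpg", detector, classifier))
```

In practice both stages would be fine-tuned on barcode data (e.g., the BarBeR dataset mentioned in the summary) before the reported detection and classification metrics could be expected; the snippet only illustrates how the detection and classification stages hand off cropped ROIs.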