Cost-Effective Active Laser Scanning System for Depth-Aware Deep-Learning-Based Instance Segmentation in Poultry Processing
The poultry industry plays a pivotal role in global agriculture, with poultry serving as a major source of protein and contributing significantly to economic growth. However, the sector faces challenges associated with labor-intensive tasks that are repetitive and physically demanding. Automation has emerged as a critical solution to enhance operational efficiency and improve working conditions. Specifically, robotic manipulation and handling of objects are becoming ubiquitous in factories. However, it remains challenging to precisely identify items in a pile of objects with similar textures and colors and to guide a robot to handle them. This paper focuses on the development of a vision system for a robotic solution aimed at automating the chicken rehanging process, a fundamental yet physically strenuous activity in poultry processing. To address the limitations of generic instance segmentation models in identifying overlapping objects, a cost-effective, dual-active laser scanning system was developed to generate precise depth data on objects. The well-registered depth data were integrated with the RGB images and sent to the instance segmentation model for individual chicken detection and identification. This enhanced approach significantly improved the model’s performance in handling complex scenarios involving overlapping chickens. Specifically, the integration of RGB-D data increased the model’s mean average precision (mAP) detection accuracy by 4.9% and significantly improved the center offset, a customized metric introduced in this study to quantify the distance between the ground truth mask center and the predicted mask center. Precise center detection is crucial for the development of future robotic control solutions, as it ensures accurate grasping during the chicken rehanging process. The center offset was reduced from 22.09 pixels (7.30 mm) to 8.09 pixels (2.65 mm), demonstrating the approach’s effectiveness in mitigating occlusion challenges and enhancing the reliability of the vision system.
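The center offset metric described in the abstract is straightforward to reproduce from instance masks. Below is a minimal sketch, not taken from the paper: it assumes boolean masks and a constant pixel-to-millimetre scale of roughly 0.33 mm/px (implied by the reported 22.09 px / 7.30 mm pair); the function names and toy masks are illustrative only.

```python
# Hypothetical sketch (not from the paper): the "center offset" metric, i.e. the
# Euclidean distance between the ground-truth mask centroid and the predicted
# mask centroid, optionally converted to millimetres with an assumed scale
# (~0.33 mm/px, implied by the reported 22.09 px / 7.30 mm pair).
import numpy as np

def mask_centroid(mask: np.ndarray) -> np.ndarray:
    """Centroid (x, y) of a boolean instance mask, in pixel coordinates."""
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def center_offset(gt_mask: np.ndarray, pred_mask: np.ndarray,
                  mm_per_pixel: float | None = None) -> float:
    """Distance between ground-truth and predicted centroids (pixels or mm)."""
    offset_px = np.linalg.norm(mask_centroid(gt_mask) - mask_centroid(pred_mask))
    return offset_px * mm_per_pixel if mm_per_pixel is not None else offset_px

# Toy usage: two overlapping 8-pixel-radius disks on a 64x64 grid.
yy, xx = np.mgrid[:64, :64]
gt = (xx - 30) ** 2 + (yy - 30) ** 2 <= 8 ** 2
pred = (xx - 33) ** 2 + (yy - 31) ** 2 <= 8 ** 2
print(center_offset(gt, pred))                      # offset in pixels
print(center_offset(gt, pred, mm_per_pixel=0.33))   # offset in millimetres
```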
| Main Authors: | Pouya Sohrabipour, Chaitanya Kumar Reddy Pallerla, Amirreza Davar, Siavash Mahmoudi, Philip Crandall, Wan Shou, Yu She, Dongyi Wang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-03-01 |
| Series: | AgriEngineering |
| Subjects: | poultry; precision food manufacturing; meat processing; instance segmentation; active laser scanning |
| Online Access: | https://www.mdpi.com/2624-7402/7/3/77 |
| Collection: | DOAJ |
| ISSN: | 2624-7402 |
| DOI: | 10.3390/agriengineering7030077 |
| Author Affiliations: | Pouya Sohrabipour, Siavash Mahmoudi, and Dongyi Wang: Department of Biological and Agricultural Engineering, University of Arkansas, Fayetteville, AR 72701, USA; Chaitanya Kumar Reddy Pallerla and Philip Crandall: Department of Food Science, University of Arkansas, Fayetteville, AR 72701, USA; Amirreza Davar and Wan Shou: Department of Mechanical Engineering, University of Arkansas, Fayetteville, AR 72701, USA; Yu She: Department of Industrial Engineering, Purdue University, West Lafayette, IN 47907, USA |