PoulTrans: a transformer-based model for accurate poultry condition assessment

Abstract: Recent advances in deep learning have significantly enhanced the accuracy of poultry image recognition, particularly in assessing poultry conditions. However, developing intuitive decision support tools remains a significant challenge. To address this, we present PoulTrans, an innovative image captioning framework that leverages a Convolutional Neural Network (CNN) integrated with a CSA_Encoder-Transformer architecture to generate detailed poultry status reports. The model feeds visual features extracted by the CNN into the Channel Spatial Attention Segmentation Encoder (CSA_Encoder), which produces segmented channel and spatial attention outputs. To optimize multi-level attention and improve the semantic precision of the status descriptions, we introduce a Channel Spatial Memory-Guided Transformer (CSMT) and a novel PS-Loss function. The performance of PoulTrans was evaluated on the PSC-Captions dataset, where it achieved top scores of 0.501, 0.803, 4.927, 0.608, and 1.882 on the BLEU-4, ROUGE-L, CIDEr, SPICE, and Sm metrics, respectively. Comprehensive analyses and experiments validate the effectiveness and reliability of the model, providing an advanced tool for automated generation of poultry status reports and enhancing the digital experience of poultry farmers. Our code is available at https://github.com/kong1107800/PoulTrans.
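The abstract describes a pipeline in which CNN feature maps pass through a channel and spatial attention encoder (CSA_Encoder) and a memory-guided transformer (CSMT) that decodes a textual status report. The authors' released code at the GitHub link above is authoritative; the PyTorch sketch below only illustrates the general shape of such a captioning pipeline, with a CBAM-style attention block standing in for CSA_Encoder and a stock nn.TransformerDecoder standing in for CSMT. All class names, layer sizes, and hyperparameters here are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class ChannelSpatialAttention(nn.Module):
    """CBAM-style channel + spatial attention over CNN feature maps.

    A rough stand-in for the paper's CSA_Encoder; the actual
    segmentation-style attention is described in the article.
    """
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel attention: pool spatial dims, re-weight channels.
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: re-weight positions from pooled channel maps.
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, C, H, W)
        b, c, _, _ = x.shape
        avg = x.mean(dim=(2, 3))                          # (B, C)
        mx = x.amax(dim=(2, 3))                           # (B, C)
        ch = torch.sigmoid(self.channel_mlp(avg) + self.channel_mlp(mx))
        x = x * ch.view(b, c, 1, 1)
        sp = torch.cat([x.mean(dim=1, keepdim=True),
                        x.amax(dim=1, keepdim=True)], dim=1)  # (B, 2, H, W)
        return x * torch.sigmoid(self.spatial_conv(sp))

class CaptioningModel(nn.Module):
    """CNN features -> attention encoder -> transformer decoder -> tokens."""
    def __init__(self, feat_channels=2048, d_model=512, vocab_size=10000):
        super().__init__()
        self.attn = ChannelSpatialAttention(feat_channels)
        self.proj = nn.Linear(feat_channels, d_model)
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=3)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, feats: torch.Tensor, tokens: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) from a CNN backbone; tokens: (B, T) caption ids.
        mem = self.attn(feats).flatten(2).transpose(1, 2)  # (B, H*W, C)
        mem = self.proj(mem)                               # (B, H*W, d_model)
        tgt = self.embed(tokens)                           # (B, T, d_model)
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        out = self.decoder(tgt, mem, tgt_mask=mask)
        return self.head(out)                              # (B, T, vocab)

# Smoke test with random inputs.
model = CaptioningModel()
logits = model(torch.randn(2, 2048, 7, 7), torch.randint(0, 10000, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 10000])
```

In this sketch the attention-refined feature map serves as the decoder's memory, which mirrors the encoder-to-decoder flow the abstract describes; the paper's CSMT, PS-Loss, and evaluation on PSC-Captions (BLEU-4, ROUGE-L, CIDEr, SPICE, Sm) are detailed in the article itself.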

Bibliographic Details
Main Authors: Jun Li, Bing Yang, Junyang Chen, Jiaxin Liu, Felix Kwame Amevor, Guanyu Chen, Buyuan Zhang, Xiaoling Zhao
Affiliations: College of Information Engineering, Sichuan Agricultural University (Jun Li, Bing Yang, Junyang Chen, Jiaxin Liu, Guanyu Chen, Buyuan Zhang); Key Laboratory of Livestock and Poultry Multi-Omics, College of Animal Science and Technology, Sichuan Agricultural University (Felix Kwame Amevor, Xiaoling Zhao)
Format: Article
Language: English
Published: Nature Portfolio, 2025-04-01
Series: Scientific Reports, Vol. 15, Iss. 1, pp. 1-19 (2025)
ISSN: 2045-2322
Subjects: Deep learning; Poultry state; Image caption; Transformer; PoulTrans
Online Access: https://doi.org/10.1038/s41598-025-98078-w
Collection: DOAJ