Decoding vocal indicators of stress in laying hens: A CNN-MFCC deep learning framework
| Main Author: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-08-01 |
| Series: | Smart Agricultural Technology |
| Subjects: | |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S2772375525002898 |
| Summary: | Artificial intelligence is transforming our capacity to interpret and respond to animal emotional states. This study leverages Convolutional Neural Networks (CNNs) combined with Mel Frequency Cepstral Coefficients (MFCCs) to decode vocalization patterns in laying hens experiencing acute environmental stress. Controlled exposure to realistic auditory stimuli (dog barking) and visual stimuli (umbrella opening) across different developmental stages enabled a comparative evaluation of vocal stress responses within a commercial-like experimental setup. Over five weeks, audio data were systematically captured from control and treatment groups, providing insights into vocal behaviors before and after stress induction. Younger hens demonstrated significantly elevated vocal activity and more pronounced spectral shifts when stressed, underscoring age-dependent variation in emotional reactivity and coping mechanisms. The CNN model attained 94% classification accuracy, reliably discriminating stressor types, age categories, and exposure conditions based solely on MFCC-derived acoustic signatures. Analysis further revealed that lower-order MFCC features are acutely responsive to stress-induced vocal dynamics, whereas higher-order coefficients remained relatively constant, reflecting subtler aspects of emotional state. These findings position vocalizations as powerful, non-invasive biomarkers of welfare status in poultry, supporting real-time, AI-driven monitoring solutions. By enabling early, precise detection of distress signals, this approach holds substantial promise for improving welfare standards and management decisions in livestock production. Ultimately, the study presents a robust, scalable methodology for digital agriculture, turning previously silent animal expressions into essential indicators of wellbeing and making farm animal management a more ethically responsive practice. |
| ISSN: | 2772-3755 |
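The record does not include the study's code, but the MFCC feature-extraction step the abstract describes (framing the audio, taking a power spectrum, applying a mel filterbank, and decorrelating the log energies with a DCT) is standard and can be sketched in plain NumPy. The parameter choices below (16 kHz sampling, 512-sample frames, 26 mel bands, 13 coefficients) are common defaults assumed for illustration, not the settings used in the paper.

```python
import numpy as np
from scipy.fftpack import dct

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(signal, sr=16000, n_fft=512, hop=256, n_mels=26, n_mfcc=13):
    """Minimal MFCC sketch: frame -> window -> |FFT|^2 -> mel filterbank -> log -> DCT.
    Parameter values are illustrative assumptions, not the study's settings."""
    # Slice the signal into overlapping frames and apply a Hann window
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop: i * hop + n_fft] for i in range(n_frames)])
    frames = frames * np.hanning(n_fft)
    # Power spectrum of each frame
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2
    # Triangular mel filterbank spanning 0 Hz to the Nyquist frequency
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        left, center, right = bins[m - 1], bins[m], bins[m + 1]
        for k in range(left, center):
            fbank[m - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fbank[m - 1, k] = (right - k) / max(right - center, 1)
    # Log mel energies, then DCT-II to decorrelate; keep the first n_mfcc coefficients.
    # The abstract's "lower-order" features are the first few columns of this result.
    logmel = np.log(power @ fbank.T + 1e-10)
    return dct(logmel, type=2, axis=1, norm="ortho")[:, :n_mfcc]
```

The resulting `(frames, coefficients)` matrix is the kind of two-dimensional acoustic signature a CNN classifier can consume directly, which is why the MFCC-CNN pairing described in the abstract is a natural fit.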