A Smartphone-Based Non-Destructive Multimodal Deep Learning Approach Using pH-Sensitive Pitaya Peel Films for Real-Time Fish Freshness Detection


Bibliographic Details
Main Authors: Yixuan Pan, Yujie Wang, Yuzhe Zhou, Jiacheng Zhou, Manxi Chen, Dongling Liu, Feier Li, Can Liu, Mingwan Zeng, Dongjing Jiang, Xiangyang Yuan, Hejun Wu
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Foods
Online Access: https://www.mdpi.com/2304-8158/14/10/1805
Description
Summary: The detection of fish freshness is crucial for ensuring food safety. This study addresses the limitations of traditional detection methods, which rely on laboratory equipment and complex procedures, by proposing a smartphone-based detection method, termed FreshFusionNet, that combines a pitaya peel pH intelligent indicator film with multimodal deep learning. The pitaya peel indicator film, prepared using high-pressure homogenization, shows a pronounced color change from dark red to yellow in response to the volatile alkaline substances released during fish spoilage. To construct a multimodal dataset, 3600 images of the indicator film were captured with a smartphone under varied lighting (natural and indoor) and viewing angles (0° to 120°), while pH, total volatile basic nitrogen (TVB-N), and total viable count (TVC) were recorded simultaneously. Building on the lightweight MobileNetV2 network, a Multi-scale Dilated Fusion Attention (MDFA) module was designed to improve the robustness of color feature extraction. A Temporal Convolutional Network (TCN) was then used to model the dynamics of the chemical indicators across spoilage stages, and a Context-Aware Gated Fusion (CAG-Fusion) mechanism adaptively integrates the image features with the chemical temporal features. Experimental results show that FreshFusionNet reaches an overall classification accuracy of 99.61%, with a single-inference latency of 142 ± 40 ms (tested on a Xiaomi 14). The method requires no professional equipment and enables real-time, non-destructive detection of fish spoilage on a smartphone, giving consumers and the food supply chain a low-cost, portable quality-monitoring tool and promoting the intelligent, widespread adoption of food safety detection technology.
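The record does not give the CAG-Fusion equations, but the description (adaptively integrating image and chemical temporal features) matches a standard gated-fusion pattern. Below is a minimal NumPy sketch of one plausible formulation; the feature dimensions (128-d image embedding, 32-d chemical embedding, 64-d shared space) and the weight matrices are illustrative placeholders, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Hypothetical embeddings: image branch (MobileNetV2 + MDFA) and
# chemical-sequence branch (TCN over pH / TVB-N / TVC trajectories).
img_feat = rng.standard_normal(128)
chem_feat = rng.standard_normal(32)

# Project both modalities into a shared 64-d space. Random weights
# stand in for parameters that would be learned during training.
W_img = rng.standard_normal((64, 128)) * 0.1
W_chem = rng.standard_normal((64, 32)) * 0.1
h_img = W_img @ img_feat
h_chem = W_chem @ chem_feat

# Context-aware gate: a sigmoid over the concatenated modality context
# decides, per dimension, how much each modality contributes.
W_gate = rng.standard_normal((64, 128)) * 0.1
gate = sigmoid(W_gate @ np.concatenate([h_img, h_chem]))

# Convex combination of the two modalities, weighted by the gate.
fused = gate * h_img + (1.0 - gate) * h_chem
print(fused.shape)
```

Because the gate lies in (0, 1) per dimension, the fused vector always stays between the two projected modality features, which is one common way such a mechanism degrades gracefully when one modality is uninformative.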
ISSN: 2304-8158