Rule-Based Multi-Task Deep Learning for Highly Efficient Rice Lodging Segmentation


Bibliographic Details
Main Authors: Ming-Der Yang, Hsin-Hung Tseng
Format: Article
Language:English
Published: MDPI AG 2025-04-01
Series:Remote Sensing
Subjects:
Online Access:https://www.mdpi.com/2072-4292/17/9/1505
Description
Summary: This study proposes rule-based multi-task deep learning for highly efficient rice lodging identification, introducing prior knowledge to improve the efficiency of disaster investigation using unmanned aerial vehicle (UAV) images. Multi-task learning combines rule-based loss functions and learns the best loss weighting to train a model that conforms to prior knowledge. This combination optimizes the integration of rule-based constraints with the deep learning network and dynamically adjusts the loss function during training. Lastly, the model is deployed on an edge computing host to improve efficiency for instant inference. Fifty-one tagged 4096 × 4096 UAV images taken in 2019 were inferred, and the confusion matrix and accuracy indices were calculated. The modified model increased the recall rate for the normal rice category by 13.7%. The remaining errors may stem from changes in spatial resolution and differences in spectral values across acquisition periods, which can be mitigated by including part of the 2019 imagery in transfer training to adjust the learned features. A deep learning network incorporating prior knowledge can be deployed on edge computing devices to collect high-resolution images via regional route planning within inferred disaster-damaged farmlands, providing an efficient disaster survey tool with high detection accuracy.
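The abstract does not give the exact form of the rule-based loss or its weighting scheme; the following is only a minimal NumPy sketch of the general idea, assuming a hypothetical vegetation-index rule ("highly vegetated pixels are rarely lodged rice") added as a penalty term next to a standard cross-entropy segmentation loss, with the two terms combined under learnable log-variance weights (an uncertainty-style dynamic weighting, not necessarily the paper's method):

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-9):
    """Pixel-wise cross-entropy; probs is (N, C), labels is (N,) of class ids."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def rule_penalty(probs, exg, veg_threshold=0.2, lodging_class=1):
    """Hypothetical prior-knowledge rule: pixels with a high excess-green
    index (ExG) should rarely be classified as lodging. The penalty is the
    mean lodging probability over highly vegetated pixels."""
    high_veg = exg > veg_threshold
    return float(np.mean(probs[high_veg, lodging_class])) if high_veg.any() else 0.0

def multi_task_loss(probs, labels, exg, log_vars):
    """Combine the data-driven and rule-based terms with dynamic weights:
    L = sum_i exp(-s_i) * L_i + s_i, where s_i are learnable log-variances."""
    losses = np.array([cross_entropy(probs, labels), rule_penalty(probs, exg)])
    return float(np.sum(np.exp(-log_vars) * losses + log_vars))
```

During training the `log_vars` would be updated by the optimizer alongside the network weights, so the model itself learns how strongly to enforce the rule relative to the segmentation objective.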
ISSN:2072-4292