Convolutional Recurrent Neural Networks for Observation-Centered Plant Identification

Traditional image-centered methods of plant identification can be confounded by varying viewpoints, uneven illumination, and changing growth stages. To tolerate these significant intraclass variances, convolutional recurrent neural networks (C-RNNs) are proposed for observation-centered plant identification, mimicking human identification behavior. The C-RNN model has two components: a convolutional neural network (CNN) backbone that extracts features from each image, and recurrent neural network (RNN) units that synthesize the multiview features into a final prediction. Extensive experiments explore the best combination of CNN and RNN. All models are trained end to end, on 1 to 3 plant images of the same observation, by truncated backpropagation through time. The experiments show that the combination of MobileNet and the Gated Recurrent Unit (GRU) offers the best trade-off between classification accuracy and computational overhead on the Flavia dataset: on the holdout test set, the mean 10-fold accuracy with 1, 2, and 3 input leaves reached 99.53%, 100.00%, and 100.00%, respectively. On the BJFU100 dataset, the C-RNN model achieves a classification rate of 99.65% with two-stage end-to-end training. The observation-centered method based on C-RNNs shows potential to further improve plant identification accuracy.
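The architecture in the abstract (a CNN backbone feeding per-image features into an RNN that fuses 1 to 3 views of one observation) can be sketched roughly as follows. This is a minimal illustrative PyTorch sketch, not the authors' code: the tiny convolutional stack is a hypothetical stand-in for the MobileNet backbone the paper actually uses, and all dimensions are assumed for illustration.

```python
import torch
import torch.nn as nn

class CRNN(nn.Module):
    """Sketch of a C-RNN: a CNN extracts a feature vector per image,
    and a GRU fuses the feature sequence from one observation
    (1-3 views) into a single class prediction."""

    def __init__(self, num_classes, feat_dim=128, hidden_dim=64):
        super().__init__()
        # Stand-in backbone; the paper uses MobileNet instead.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, feat_dim),
        )
        self.gru = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, images):
        # images: (batch, seq_len, 3, H, W); seq_len = number of views.
        b, t = images.shape[:2]
        # Run the backbone on every view, then regroup by observation.
        feats = self.backbone(images.flatten(0, 1)).view(b, t, -1)
        _, h = self.gru(feats)          # h: (1, batch, hidden_dim)
        return self.classifier(h[-1])   # one logit vector per observation

model = CRNN(num_classes=32)
x = torch.randn(4, 3, 3, 64, 64)        # 4 observations, 3 views each
logits = model(x)                       # shape: (4, 32)
```

Because the whole pipeline is differentiable, a cross-entropy loss on `logits` trains the backbone and the GRU jointly, matching the end-to-end training the abstract describes.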

Bibliographic Details
Main Authors: Xuanxin Liu, Fu Xu, Yu Sun, Haiyan Zhang, Zhibo Chen
Format: Article
Language:English
Published: Wiley 2018-01-01
Series:Journal of Electrical and Computer Engineering
Online Access:http://dx.doi.org/10.1155/2018/9373210
collection DOAJ
description Traditional image-centered methods of plant identification can be confounded by varying viewpoints, uneven illumination, and changing growth stages. To tolerate these significant intraclass variances, convolutional recurrent neural networks (C-RNNs) are proposed for observation-centered plant identification, mimicking human identification behavior. The C-RNN model has two components: a convolutional neural network (CNN) backbone that extracts features from each image, and recurrent neural network (RNN) units that synthesize the multiview features into a final prediction. Extensive experiments explore the best combination of CNN and RNN. All models are trained end to end, on 1 to 3 plant images of the same observation, by truncated backpropagation through time. The experiments show that the combination of MobileNet and the Gated Recurrent Unit (GRU) offers the best trade-off between classification accuracy and computational overhead on the Flavia dataset: on the holdout test set, the mean 10-fold accuracy with 1, 2, and 3 input leaves reached 99.53%, 100.00%, and 100.00%, respectively. On the BJFU100 dataset, the C-RNN model achieves a classification rate of 99.65% with two-stage end-to-end training. The observation-centered method based on C-RNNs shows potential to further improve plant identification accuracy.
format Article
id doaj-art-ced5ad09c2c54086b7f4f8f9fb4ce908
institution DOAJ
issn 2090-0147; 2090-0155
language English
publishDate 2018-01-01
publisher Wiley
record_format Article
series Journal of Electrical and Computer Engineering
affiliation School of Information Science and Technology, Beijing Forestry University, Beijing 100083, China (all five authors)
title Convolutional Recurrent Neural Networks for Observation-Centered Plant Identification
url http://dx.doi.org/10.1155/2018/9373210