Optimization Strategy of a Stacked Autoencoder and Deep Belief Network in a Hyperspectral Remote-Sensing Image Classification Model


Bibliographic Details
Main Authors: Xiaoai Dai, Junying Cheng, Shouheng Guo, Chengchen Wang, Ge Qu, Wenxin Liu, Weile Li, Heng Lu, Youlin Wang, Binyang Zeng, Yunjie Peng, Shuneng Liang
Format: Article
Language: English
Published: Wiley, 2023-01-01
Series: Discrete Dynamics in Nature and Society
Online Access:http://dx.doi.org/10.1155/2023/9150482
Collection: DOAJ
Description: Improvements in hyperspectral image technology, diversification methods, and cost reductions have increased the convenience of hyperspectral data acquisitions. However, because of their multiband and multiredundant characteristics, hyperspectral data processing is still complex. Two feature extraction algorithms, the autoencoder (AE) and restricted Boltzmann machine (RBM), were used to optimize the classification model parameters. The optimal classification model was obtained by comparing a stacked autoencoder (SAE) and a deep belief network (DBN). Finally, the SAE was further optimized by adding sparse representation constraints and GPU parallel computation to improve classification accuracy and speed. The research results show that the SAE enhanced by deep learning is superior to the traditional feature extraction algorithm. The optimal classification model based on deep learning, namely, the stacked sparse autoencoder, achieved 93.41% and 94.92% classification accuracy on the two experimental datasets. The use of parallel computing increased the model's training speed by more than seven times, removing the limitation of the model's lengthy training time.
Record ID: doaj-art-71c0bb7c7e5e435bbe38d9f526122872
Institution: OA Journals
ISSN: 1607-887X
Author Affiliations:
Xiaoai Dai, Junying Cheng, Shouheng Guo, Chengchen Wang, Ge Qu, Wenxin Liu: School of Earth Science, Chengdu University of Technology
Weile Li: State Key Laboratory of Geohazard Prevention and Geoenvironment Protection
Heng Lu: College of Hydraulic and Hydroelectric Engineering
Youlin Wang: Northwest Engineering Corporation Limited
Binyang Zeng: Southwest Branch of China Petroleum Engineering Construction Co. Ltd
Yunjie Peng: GEOVIS Wisdom Technology Co. Ltd
Shuneng Liang: Land Satellite Remote Sensing Application Center
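The description refers to a stacked sparse autoencoder: autoencoder layers pretrained greedily, with a sparsity penalty on the hidden activations. The following is a minimal NumPy sketch of that general technique, not the authors' implementation; the layer sizes, sparsity target `rho`, penalty weight `beta`, and the synthetic 64-band "spectra" are all illustrative assumptions.

```python
# Minimal sketch of a stacked sparse autoencoder (SSAE) pretrained
# layer-by-layer, as described in the abstract. NumPy only; all
# hyperparameters and data here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SparseAELayer:
    """One tied-weight autoencoder layer with a KL-style sparsity penalty."""
    def __init__(self, n_in, n_hidden, rho=0.05, beta=0.1, lr=0.1):
        self.W = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b = np.zeros(n_hidden)   # encoder bias
        self.c = np.zeros(n_in)       # decoder bias (decoder weight is W.T)
        self.rho, self.beta, self.lr = rho, beta, lr

    def encode(self, X):
        return sigmoid(X @ self.W + self.b)

    def train_step(self, X):
        H = self.encode(X)                       # hidden activations
        Xr = sigmoid(H @ self.W.T + self.c)      # reconstruction
        # Squared-error gradient at the sigmoid output
        d_out = (Xr - X) * Xr * (1 - Xr)
        rho_hat = H.mean(axis=0)                 # mean activation per hidden unit
        # Sparsity term nudges rho_hat toward the target rho
        sparse_grad = self.beta * (-self.rho / rho_hat
                                   + (1 - self.rho) / (1 - rho_hat))
        d_hid = (d_out @ self.W + sparse_grad) * H * (1 - H)
        n = X.shape[0]
        # Tied weights: encoder and decoder contributions summed
        self.W -= self.lr * (X.T @ d_hid + d_out.T @ H) / n
        self.b -= self.lr * d_hid.mean(axis=0)
        self.c -= self.lr * d_out.mean(axis=0)
        return float(((Xr - X) ** 2).mean())

# Greedy layer-wise pretraining on synthetic "spectra" (64 bands)
X = rng.random((256, 64))
layers = [SparseAELayer(64, 32), SparseAELayer(32, 16)]
inp = X
for layer in layers:
    for _ in range(50):
        loss = layer.train_step(inp)
    inp = layer.encode(inp)   # codes feed the next layer

features = inp                # 16-D deep features for a downstream classifier
```

In the full pipeline the paper describes, such pretrained features would feed a classifier and the whole stack would be fine-tuned; the GPU speedup reported in the abstract would come from running these matrix operations on a parallel backend rather than NumPy.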