Adaptive dimensionality reduction for neural network-based online principal component analysis.

"Principal Component Analysis" (PCA) is an established linear technique for dimensionality reduction. It performs an orthonormal transformation to replace possibly correlated variables with a smaller set of linearly independent variables, the so-called principal components, which capture a...

Full description

Saved in:
Bibliographic Details
Main Authors: Nico Migenda, Ralf Möller, Wolfram Schenck
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2021-01-01
Series: PLoS ONE, Vol. 16, Iss. 3, e0248896 (2021)
Online Access: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0248896&type=printable
collection DOAJ
description "Principal Component Analysis" (PCA) is an established linear technique for dimensionality reduction. It performs an orthonormal transformation to replace possibly correlated variables with a smaller set of linearly independent variables, the so-called principal components, which capture a large portion of the data variance. The problem of finding the optimal number of principal components has been widely studied for offline PCA. However, when working with streaming data, the optimal number changes continuously. This requires to update both the principal components and the dimensionality in every timestep. While the continuous update of the principal components is widely studied, the available algorithms for dimensionality adjustment are limited to an increment of one in neural network-based and incremental PCA. Therefore, existing approaches cannot account for abrupt changes in the presented data. The contribution of this work is to enable in neural network-based PCA the continuous dimensionality adjustment by an arbitrary number without the necessity to learn all principal components. A novel algorithm is presented that utilizes several PCA characteristics to adaptivly update the optimal number of principal components for neural network-based PCA. A precise estimation of the required dimensionality reduces the computational effort while ensuring that the desired amount of variance is kept. The computational complexity of the proposed algorithm is investigated and it is benchmarked in an experimental study against other neural network-based and incremental PCA approaches where it produces highly competitive results.
id doaj-art-95e4150aae6943e6ab483738bcbfc3c9
institution OA Journals
issn 1932-6203