Classification of reduction invariants with improved backpropagation

Bibliographic Details
Main Authors: S. M. Shamsuddin, M. Darus, M. N. Sulaiman
Format: Article
Language: English
Published: Wiley 2002-01-01
Series: International Journal of Mathematics and Mathematical Sciences
Online Access: http://dx.doi.org/10.1155/S0161171202006117
Description
Summary: Data reduction is a feature-extraction process that transforms the data space into a feature space of much lower dimension than the original data space, while retaining most of the intrinsic information content of the data. This can be done with a number of methods, such as principal component analysis (PCA), factor analysis, and feature clustering. Principal components are extracted from a collection of multivariate cases as a way of accounting for as much of the variation in that collection as possible by means of as few variables as possible. The backpropagation network, on the other hand, has been used extensively in classification problems such as the XOR problem, share-price prediction, and pattern recognition. This paper proposes an improved error signal for the backpropagation network to classify reduction invariants obtained with principal component analysis, which extracts the bulk of the useful information present in the moment invariants of handwritten digits while leaving the redundant information behind. Higher-order centralised scale invariants are used to extract features of the handwritten digits before PCA, and the reduction invariants are fed to the improved backpropagation model for classification.
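
The record does not include the authors' improved error-signal formulation, so the Python sketch below only illustrates the general pipeline the summary describes: moment-invariant feature vectors reduced by PCA and classified with a one-hidden-layer backpropagation network. All function names, layer sizes, and the random toy data are illustrative assumptions, and a standard delta-rule error signal stands in for the paper's improved version.

```python
# Minimal sketch of the PCA + backpropagation pipeline described in the summary.
# The toy data below is a placeholder for higher-order centralised scale
# invariants of handwritten digits; it is not the authors' data or method.
import numpy as np

rng = np.random.default_rng(0)

def pca_reduce(X, n_components):
    """Project feature vectors onto the top principal components."""
    X_centered = X - X.mean(axis=0)
    cov = np.cov(X_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)                     # ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return X_centered @ top

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_backprop(X, Y, hidden=8, lr=0.5, epochs=2000):
    """One-hidden-layer network trained with standard backpropagation."""
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, n_out))
    for _ in range(epochs):
        H = sigmoid(X @ W1)                  # hidden activations
        O = sigmoid(H @ W2)                  # output activations
        # Standard output-layer error signal; the paper's improved error
        # signal would replace this term.
        delta_out = (O - Y) * O * (1 - O)
        delta_hid = (delta_out @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ delta_out / len(X)
        W1 -= lr * X.T @ delta_hid / len(X)
    return W1, W2

# Toy 10-class example on made-up 16-dimensional invariant vectors.
X_raw = rng.normal(size=(200, 16))           # placeholder moment invariants
labels = rng.integers(0, 10, size=200)
Y = np.eye(10)[labels]                       # one-hot targets
X_red = pca_reduce(X_raw, n_components=6)    # "reduction invariants"
W1, W2 = train_backprop(X_red, Y)
pred = sigmoid(sigmoid(X_red @ W1) @ W2).argmax(axis=1)
print("training accuracy:", (pred == labels).mean())
```

The design choice here mirrors the summary: PCA first discards redundant directions in the invariant features, and only the reduced vectors are presented to the network, which keeps the classifier small relative to the raw feature dimension.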
ISSN:0161-1712
1687-0425