File(s) under permanent embargo
A globally convergent MC algorithm with an adaptive learning rate
journal contribution
posted on 2012-01-01, 00:00, authored by D. Peng, Z. Yi, Yong Xiang, H. Zhang

This brief deals with the problem of minor component analysis (MCA). Artificial neural networks can be exploited to perform MCA. Recent studies show that convergence of neural-network-based MCA algorithms can be guaranteed if the learning rates are below certain thresholds. However, computing these thresholds requires the eigenvalues of the autocorrelation matrix of the data set, which are unavailable when the minor component is extracted online from an input data stream. In this correspondence, we introduce an adaptive learning rate into the OJAn MCA algorithm, so that its convergence condition does not depend on any unobtainable information and can be easily satisfied in practical applications.
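The abstract does not spell out the algorithm, but the general task it addresses — extracting the minor component, i.e. the eigenvector associated with the smallest eigenvalue of the data's autocorrelation matrix, from a stream of samples — can be sketched with a generic online anti-Hebbian update. The sketch below is an illustration only, not the paper's adaptive-rate OJAn rule: it uses a fixed learning rate `eta` with explicit renormalisation, and the synthetic data, step size, and sample count are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data whose covariance has one clearly smallest
# eigenvalue; the minor component is then the third coordinate axis.
stds = np.array([3.0, 2.0, 0.5])
X = rng.normal(size=(20000, 3)) * stds

# Random unit initial weight vector for the single linear neuron.
w = rng.normal(size=3)
w /= np.linalg.norm(w)

eta = 0.002  # fixed step size for this illustration (the paper adapts it online)
for x in X:
    y = w @ x               # neuron output y = w^T x
    w = w - eta * y * x     # anti-Hebbian update suppresses high-variance directions
    w /= np.linalg.norm(w)  # renormalise to keep w on the unit sphere

minor = np.array([0.0, 0.0, 1.0])
print(abs(w @ minor))  # near 1 when w has aligned with the minor component
```

In expectation the update applies `(I - eta * C)` to `w`, where `C` is the autocorrelation matrix, so with `eta` small enough the direction of smallest eigenvalue becomes dominant after normalisation. The fixed-`eta` convergence condition depends on the (unknown) eigenvalues of `C`, which is exactly the difficulty the paper's adaptive learning rate is meant to remove.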
History
Journal
IEEE Transactions on Neural Networks and Learning Systems
Volume
23
Issue
2
Pagination
359 - 365
Publisher
IEEE
Location
Piscataway, N.J.
ISSN
2162-237X
eISSN
2162-2388
Language
eng
Publication classification
C1 Refereed article in a scholarly journal
Copyright notice
2012, IEEE
Categories
No categories selected
Keywords
deterministic discrete time system; eigenvalue; eigenvector; minor component analysis; neural networks; Science & Technology; Technology; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic; Computer Science; Engineering; DISCRETE-TIME DYNAMICS; EXTRACTION; SUBSPACE; PCA; PRINCIPAL