A fast algorithm for updating and downsizing the dominant kernel principal components
Academic Article
Publication Date:
2010
abstract:
Many important kernel methods in machine learning, such as kernel principal component analysis, feature approximation, denoising, compression, and prediction, require the computation of the dominant set of eigenvectors of the symmetric kernel Gram matrix.
Recently, an efficient incremental approach was presented for the fast calculation of the dominant kernel eigenbasis.
In this manuscript we propose faster algorithms for incrementally updating and downsizing the dominant kernel eigenbasis. These methods are well suited to large-scale problems since they are efficient in terms of both computational complexity and data management.
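The quantity at the heart of the abstract is the dominant eigenbasis of a symmetric kernel Gram matrix. As a minimal illustration of that object (a plain batch eigendecomposition in NumPy, not the paper's fast incremental updating/downsizing algorithm), the sketch below builds an RBF Gram matrix and extracts its top-k eigenpairs; the function names `rbf_kernel_gram` and `dominant_eigenbasis` and the bandwidth `gamma` are illustrative choices, not from the article:

```python
import numpy as np

def rbf_kernel_gram(X, gamma=0.5):
    """Symmetric Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).

    gamma is an illustrative bandwidth choice, not a value from the article.
    """
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def dominant_eigenbasis(K, k):
    """Return the k largest eigenvalues and eigenvectors of symmetric K."""
    w, V = np.linalg.eigh(K)           # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]      # indices of the k largest
    return w[idx], V[:, idx]

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))       # 50 points in R^3
K = rbf_kernel_gram(X)
w, V = dominant_eigenbasis(K, k=5)     # dominant kernel eigenbasis
```

The article's contribution is to avoid recomputing this decomposition from scratch when samples are added (updating) or removed (downsizing); the batch `eigh` call above is the baseline cost such incremental methods improve on.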
Iris type:
01.01 Articolo in rivista (journal article)
Keywords:
Dominant eigenvalues; Updating; Kernel Gram matrix; Principal components; Large scale data
List of contributors:
Mastronardi, Nicola
Published in: