Intrinsic Grassmann Averages for Online Linear, Robust and Nonlinear Subspace Learning.


Journal

IEEE Transactions on Pattern Analysis and Machine Intelligence
ISSN: 1939-3539
Abbreviated title: IEEE Trans Pattern Anal Mach Intell
Country: United States
NLM ID: 9885960

Publication information

Publication date:
Nov 2021
History:
pubmed: 10 5 2020
medline: 10 5 2020
entrez: 10 5 2020
Status: ppublish

Abstract

Principal component analysis (PCA) and kernel principal component analysis (KPCA) are fundamental methods in machine learning for dimensionality reduction. Both seek a low-dimensional subspace that best approximates the data: the former finds this approximation in the original finite-dimensional space, while the latter typically works in an infinite-dimensional reproducing kernel Hilbert space (RKHS). In this paper, we present a geometric framework for computing the principal linear subspaces in both situations (finite- and infinite-dimensional), as well as in the robust PCA case, which amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold. Points on this manifold are defined as the subspaces spanned by K-tuples of observations. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution. We show similar results in the RKHS case and provide an efficient algorithm for computing the projection onto this average subspace. The result is a method akin to KPCA that is substantially faster. Further, we present a novel online version of KPCA using our geometric framework. Competitive performance of all our algorithms is demonstrated on a variety of real and synthetic data sets.
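To illustrate the subspace-averaging idea described in the abstract, the following is a minimal Python sketch, not the authors' algorithm: it averages the 1-D subspaces spanned by individual centered observations, aligning signs so that a spanning vector and its negation represent the same subspace. The function name grassmann_average_1d and parameters such as n_iter are hypothetical, and the intrinsic (manifold) averaging of the paper is replaced here by a simple iterative extrinsic average for clarity.

    import numpy as np

    def grassmann_average_1d(X, n_iter=20, seed=0):
        """Estimate a leading 1-D subspace of the rows of X by averaging
        the subspaces spanned by individual centered observations.

        Illustrative sketch only: sign-aligned averaging of unit directions,
        a simplified stand-in for the intrinsic Grassmann average.
        """
        X = X - X.mean(axis=0)                # center the observations
        rng = np.random.default_rng(seed)
        q = rng.standard_normal(X.shape[1])
        q /= np.linalg.norm(q)                # random initial direction
        for _ in range(n_iter):
            # Each observation spans a 1-D subspace; choose the representative
            # unit vector lying on the same half-sphere as the current estimate.
            signs = np.sign(X @ q)
            signs[signs == 0] = 1.0
            q_new = (signs[:, None] * X).mean(axis=0)
            norm = np.linalg.norm(q_new)
            if norm < 1e-12:
                break
            q = q_new / norm
        return q

    # Usage: synthetic data whose dominant direction is roughly the first axis.
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.standard_normal((500, 5)) * np.array([5.0, 1.0, 1.0, 1.0, 1.0])
        print(grassmann_average_1d(X))        # close to +/- the first basis vector

For Gaussian data this kind of averaged direction aligns with the leading principal component, which is the property the paper establishes rigorously for its intrinsic Grassmann averages.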

Identifiers

pubmed: 32386140
doi: 10.1109/TPAMI.2020.2992392

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Pagination

3904-3917

Authors

MeSH classifications