
Gradient-based learning methods extended to smooth manifolds applied to automated clustering

Koudounas A. (Software); Fiori S. (Conceptualization)
2020

Abstract

Grassmann-manifold-based sparse spectral clustering is a classification technique that learns a sparse latent representation of data in the form of a subspace basis. To learn this representation, spectral clustering is formulated as a loss-minimization problem over a smooth manifold known as the Grassmannian. Such a minimization problem cannot be tackled by traditional gradient-based learning algorithms, which are only suitable for optimization in the absence of constraints among the parameters. It is therefore necessary to develop specific optimization/learning algorithms that can efficiently seek a local minimum of a loss function under smooth constraints. This need calls for manifold optimization methods. In this paper, we extend classical gradient-based learning algorithms from flat parameter spaces (from classical gradient descent to adaptive momentum) to curved spaces (smooth manifolds) by means of tools from manifold calculus. We compare the clustering performance of these methods with that of known methods from the scientific literature. The results confirm that the proposed learning algorithms are computationally lighter than existing ones without detriment to clustering efficacy.
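The abstract describes extending gradient-based learning to the Grassmannian but does not reproduce the algorithms themselves. As a minimal illustrative sketch (not the paper's own implementation), plain Riemannian gradient descent on the Grassmann manifold Gr(n, p) combines a tangent-space projection of the Euclidean gradient with a QR-based retraction; all function names and the toy objective below are hypothetical.

```python
import numpy as np

def riemannian_gradient(X, egrad):
    # Project the Euclidean gradient onto the tangent space of the
    # Grassmannian at X (columns of X form an orthonormal basis):
    # grad = (I - X X^T) egrad
    return egrad - X @ (X.T @ egrad)

def qr_retraction(X, V):
    # Map the stepped point X + V back onto the manifold via thin QR.
    Q, R = np.linalg.qr(X + V)
    # Flip column signs so diag(R) >= 0, making the retraction continuous.
    signs = np.sign(np.sign(np.diag(R)) + 0.5)
    return Q * signs

def grassmann_gd(loss_egrad, X0, lr=0.01, iters=1000):
    # Fixed-step Riemannian gradient descent: project, step, retract.
    X = X0
    for _ in range(iters):
        g = riemannian_gradient(X, loss_egrad(X))
        X = qr_retraction(X, -lr * g)
    return X
```

As a usage example, minimizing the loss -tr(X^T A X) for a symmetric matrix A (Euclidean gradient -2AX) drives X toward an orthonormal basis of the dominant p-dimensional eigenspace of A, a standard test problem for Grassmannian optimizers.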
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/286582
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 2
  • Web of Science: n/a