Cavanini, Luca; Felicetti, Riccardo; Ferracuti, Francesco; Monteriu', Andrea. "Fixed-size LS-SVM LPV System Identification for Large Datasets." International Journal of Control, Automation, and Systems, vol. 21, no. 12 (2023), pp. 4067-4079. ISSN 1598-6446. DOI: 10.1007/s12555-023-0062-y

Fixed-size LS-SVM LPV System Identification for Large Datasets

Cavanini, Luca; Felicetti, Riccardo; Ferracuti, Francesco; Monteriu', Andrea
2023-01-01

Abstract

In this paper, we propose an efficient method for handling large datasets in linear parameter-varying (LPV) model identification. The method is based on least-squares support vector machine (LS-SVM) identification in the primal space. To make the identification computationally feasible even for very large datasets, we estimate a finite-dimensional feature map through a two-step procedure that reduces the computational effort. First, we define the training set as a fixed-size subsample of the entire dataset, selected according to a collision-entropy criterion. Second, we approximate the feature map through the eigenvalue decomposition of the kernel matrices. This paper considers both autoregressive with exogenous input (ARX) and state-space (SS) model forms. A comparison of the problem formulations in the primal and dual spaces, in terms of accuracy and computational complexity, shows that the main advantage of the proposed technique is the reduction in space and time complexity during the training stage, which makes it preferable for handling very large datasets. To validate the proposed primal approach, we apply it to estimate LPV models from the provided inputs, outputs, and scheduling signals of two nonlinear benchmarks: the parallel Wiener-Hammerstein system and the Silverbox system. The performance of the proposed approach is compared with that of the dual LS-SVM approach and of kernel principal component regression.
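
As a rough illustration of the two steps described in the abstract, the Python sketch below selects a fixed-size subsample by maximizing a collision-entropy estimate, builds an approximate finite-dimensional feature map from the eigenvalue decomposition of the subsample kernel matrix (a Nystrom-type approximation), and then solves the primal LS-SVM as a ridge-style least-squares problem. The RBF kernel, the swap-based selection loop, the synthetic data, and all function names and hyperparameters are illustrative assumptions, not the authors' implementation or the LPV-ARX regressor construction used in the paper.

# Minimal sketch of fixed-size LS-SVM regression in the primal space.
# Assumptions (not from the paper): RBF kernel, swap-based entropy
# maximization for subsample selection, no bias term, synthetic data.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of A and rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def collision_entropy(Xs, sigma=1.0):
    # Collision (quadratic Renyi) entropy estimate of the subsample,
    # up to constants: H2 = -log(mean of the kernel matrix).
    return -np.log(np.mean(rbf_kernel(Xs, Xs, sigma)))

def select_subsample(X, m, sigma=1.0, iters=500, seed=None):
    # Keep a candidate swap only if it increases the collision entropy
    # of the fixed-size working set.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), m, replace=False)
    best = collision_entropy(X[idx], sigma)
    for _ in range(iters):
        trial = idx.copy()
        trial[rng.integers(m)] = rng.integers(len(X))
        h = collision_entropy(X[trial], sigma)
        if h > best:
            idx, best = trial, h
    return idx

def nystrom_features(X, Xs, sigma=1.0, tol=1e-10):
    # Approximate finite-dimensional feature map from the eigenvalue
    # decomposition of the kernel matrix evaluated on the subsample Xs.
    lam, U = np.linalg.eigh(rbf_kernel(Xs, Xs, sigma))
    keep = lam > tol
    lam, U = lam[keep], U[:, keep]
    return rbf_kernel(X, Xs, sigma) @ U / np.sqrt(lam)

def fit_primal_lssvm(Phi, y, gamma=1e3):
    # Primal LS-SVM on the approximate features: a regularized
    # least-squares (ridge-style) problem of size d x d, d <= m.
    d = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + (1.0 / gamma) * np.eye(d), Phi.T @ y)

# Synthetic regression data standing in for LPV-ARX regressors built
# from inputs, outputs, and scheduling signals (hypothetical example).
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 6))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.05 * rng.standard_normal(5000)

idx = select_subsample(X, m=200, sigma=1.5, seed=0)
Phi = nystrom_features(X, X[idx], sigma=1.5)
w = fit_primal_lssvm(Phi, y, gamma=1e3)
print("training RMSE:", np.sqrt(np.mean((y - Phi @ w) ** 2)))

Under these assumptions, the training cost is dominated by an m x m eigenvalue decomposition and a d x d linear solve (with d <= m much smaller than the dataset size), rather than by an N x N kernel system as in the dual formulation, which is the space and time advantage the abstract refers to.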


Use this identifier to cite or link to this document: https://hdl.handle.net/11566/323971