Model reduction of a large scale system using PCA technique


Date

2009

Publisher

Maltepe Üniversitesi

Access Rights

CC0 1.0 Universal
info:eu-repo/semantics/openAccess

Abstract

Most model reduction techniques proposed in the literature are based on multivariate statistical methods. Linear Principal Component Analysis (PCA) is one of the best-known methods in data analysis: it seeks a subspace of lower dimension than the original space and projects the studied data onto it with minimal loss of information, so the result is a reduced-dimension representation of the data. To reduce computation when the correlation matrix is large, neural-network implementations of PCA have been proposed. In general, neural-network approaches to PCA are derived from one of two equivalent optimization criteria: maximizing the variance of the projected data or minimizing the quadratic error of the reconstructed data. Most approaches that use multi-layer perceptron networks to obtain the nonlinear PCA model (NLPCA) run into nonlinear optimization problems, notably convergence difficulties and the sensitivity of this network type to initialization. For this reason, by combining principal curves with Radial Basis Function neural networks, we propose an NLPCA approach built from two cascaded three-layer networks. The training problem then reduces to a linear regression with respect to the output-layer weights. The algorithm that determines the number of nonlinear components to retain in the NLPCA model is based on the cumulative variance.
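The cumulative-variance criterion for choosing the number of retained components can be sketched as follows. This is a minimal NumPy illustration of linear PCA only, not the authors' RBF-based NLPCA; the function name `pca_reduce` and the 95% variance threshold are assumptions introduced for illustration:

```python
import numpy as np

def pca_reduce(X, var_threshold=0.95):
    """Project X (n_samples x n_features) onto the smallest number of
    principal components whose cumulative explained variance reaches
    var_threshold (hypothetical helper; threshold is an assumption)."""
    Xc = X - X.mean(axis=0)                   # center the data
    cov = np.cov(Xc, rowvar=False)            # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigendecomposition (ascending)
    order = np.argsort(eigvals)[::-1]         # reorder by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    cum = np.cumsum(eigvals) / eigvals.sum()  # cumulative explained variance
    k = int(np.searchsorted(cum, var_threshold) + 1)
    T = Xc @ eigvecs[:, :k]                   # scores in the reduced subspace
    return T, k, cum[:k]
```

For data that actually lie near a low-dimensional subspace, `k` comes out much smaller than the original number of features, which is the dimension-reduction effect described above.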

Source

International Conference of Mathematical Sciences

Citation

Kouadri, A., Zelmat, M. and Namoune, A. (2009). Model reduction of a large scale system using PCA technique. Maltepe Üniversitesi, İnsan ve Toplum Bilimleri Fakültesi. p. 62.