A New Local PCA SOM Algorithm

Dong Huang, Zhang Yi and Xiaorong Pu

Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, P. R. China.
E-mail: donnyhuang, zhangyi, .

Abstract

This paper proposes a Local PCA-SOM algorithm. The new competition measure is computationally efficient, and it implicitly incorporates the Mahalanobis distance and the reconstruction error. In contrast to previous models, no matrix inversion or PCA decomposition is needed for each data input. Moreover, the local data distribution is stored completely in the covariance matrix rather than in a pre-defined number of principal components, so no prior information about the optimal principal subspace is required. Experiments on both synthetic data and a pattern learning task are carried out to show the performance of the proposed method.

Key words: Neural Networks; Unsupervised Learning; Local Principal Component Analysis; Self-Organizing Mapping

1 This work was supported by the National Science Foundation of China under Grant 60471055 and the Specialized Research Fund for the Doctoral Program of Higher Education under Grant 20040614017.

Preprint submitted to Elsevier Science, 28 September 2007

1 Introduction

Principal component analysis (PCA) has been widely used in dimension reduction of multivariate data, data compression, pattern recognition and statistical analysis. A PCA algorithm is designed to approximate the original high-dimensional pattern space by a low-dimensional subspace spanned by the principal eigenvectors of the data covariance matrix. In this way, the data distribution can be represented and reconstructed by the principal eigenvectors and their corresponding eigenvalues.
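
As a concrete reference point for the discussion that follows, the sketch below (our own illustration in Python/NumPy, not code from the paper) performs exactly this classical batch procedure: estimate the covariance matrix, take its m leading eigenvectors, and reconstruct the data from its coordinates in the spanned subspace.

```python
import numpy as np

def pca_reconstruct(X, m):
    """Project data onto the m leading eigenvectors of its
    covariance matrix and reconstruct it in the original space."""
    c = X.mean(axis=0)                       # sample mean
    C = np.cov(X, rowvar=False)              # data covariance matrix
    evals, evecs = np.linalg.eigh(C)         # eigh returns ascending eigenvalues
    W = evecs[:, -m:]                        # m principal eigenvectors
    Y = (X - c) @ W                          # coordinates in the principal subspace
    return Y @ W.T + c                       # reconstruction in the original space

X = np.random.randn(500, 10)
X_hat = pca_reconstruct(X, m=3)
print(np.mean((X - X_hat) ** 2))             # mean squared reconstruction error
```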

However, PCA is globally linear and inefficient for data distributions with non-linear dependencies [1]. Several extensions have been suggested to overcome this problem. They fall into two main categories: global nonlinear approaches and locally linear descriptions. The former includes Principal Curves [2] and Kernel PCA [3]. We focus on the latter, in which the data distribution is represented by a collection of local PCA units [7,8].

When describing a data distribution with a mixture model, the question arises as to what kind of units the mixture should contain. In Gaussian mixture models [7], the local iso-density surface of a uniform Gaussian unit is a sphere, whereas a local PCA unit corresponds to a multivariate Gaussian with an ellipsoidal iso-density surface. Despite its greater complexity, a local PCA unit is preferable to a sphere unit for the following reasons. A single ellipsoid can describe a local structure for which many spheres would be needed. Furthermore, data distributions are usually constrained locally to subspaces of fewer dimensions than the space of the training data, so there are directions in which the distribution has locally zero variance (or almost zero, because of noise). An ellipsoid unit can extend its minor components into these additional noise dimensions, but the computational cost then grows over-proportionally with the number of principal components needed.
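
The contrast between sphere and ellipsoid units can be made concrete with the Mahalanobis distance, which rescales each direction by the unit's variance along it. A minimal sketch (our own example with a hypothetical elongated unit, not taken from the paper):

```python
import numpy as np

c = np.array([0.0, 0.0])                     # unit center
C = np.array([[4.0, 0.0],                    # elongated covariance: large variance
              [0.0, 0.25]])                  # along x, small variance along y
x = np.array([0.0, 1.0])

d_euclid = np.sum((x - c) ** 2)              # sphere unit: iso-surfaces are circles
d_mahal = (x - c) @ np.linalg.inv(C) @ (x - c)  # ellipsoid unit: ellipsoidal iso-surfaces

print(d_euclid, d_mahal)  # 1.0 vs 4.0: x lies far along the unit's minor axis
```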

In the recent work NGAS-PCA [4], the local PCA model is based on the soft competition scheme of the Neural Gas algorithm, with RRLSA [10] online PCA learning in each local unit. The novelty of NGAS-PCA is that its distance measure for neuron competition combines a normalized Mahalanobis distance with the squared reconstruction error. Note, however, that storing the principal basis vectors for every unit involves rather complicated updating procedures, both in NGAS-PCA and in ASSOM [9].
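
The exact normalization used in [4] is not reproduced in this excerpt; the sketch below shows one plausible form of such a combined measure, with a variance-normalized Mahalanobis term in the m-dimensional principal subspace plus the squared reconstruction error of the residual. All names are our own, and the relative weighting of the two terms in NGAS-PCA may differ.

```python
import numpy as np

def combined_distance(x, c, W, evals):
    """A competition measure of the kind described for NGAS-PCA:
    a normalized Mahalanobis distance in the principal subspace
    plus the squared reconstruction error of the residual.
    W: (n, m) principal eigenvectors; evals: their m eigenvalues."""
    d = x - c
    y = W.T @ d                              # coordinates in the principal subspace
    mahal = np.sum(y ** 2 / evals)           # variance-normalized (Mahalanobis) term
    residual = d - W @ y                     # component outside the subspace
    return mahal + residual @ residual       # plus squared reconstruction error

# usage with a hypothetical 3-D unit keeping m = 2 principal directions
C = np.diag([4.0, 1.0, 0.04])
evals_all, evecs = np.linalg.eigh(C)
W, evals = evecs[:, -2:], evals_all[-2:]
print(combined_distance(np.array([1.0, 1.0, 1.0]), np.zeros(3), W, evals))
```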

Another recent model, PCA-SOM [5], takes an alternative route by storing the local information in the covariance matrix. This approach is derived directly from statistical theory and has great advantages over the ASSOM model, both in computational burden and in the reliability of the result. However, PCA-SOM uses only the reconstruction error in the principal subspace for its neuron competition. For this reason, PCA-SOM still needs a PCA decomposition, or an update with a pre-defined number of principal components, each time a training sample is presented.
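
To make the cost explicit: evaluating this reconstruction error requires the m leading eigenvectors of the unit's covariance matrix, so the decomposition (or an incremental update of it) must be available for every unit whenever a sample arrives. A sketch of the competition term, again our own illustration:

```python
import numpy as np

def reconstruction_error(x, c, C, m):
    """PCA-SOM-style competition term: squared distance of (x - c)
    to the m-dimensional principal subspace of the unit covariance C.
    Needs an eigendecomposition each time it is evaluated."""
    _, evecs = np.linalg.eigh(C)             # per-sample decomposition cost
    W = evecs[:, -m:]                        # m principal eigenvectors
    d = x - c
    return d @ d - np.sum((W.T @ d) ** 2)    # ||d||^2 minus energy inside the subspace
```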

In this paper, we propose a new computationally efficient local PCA algorithm that combines the advantages of NGAS-PCA and PCA-SOM. Each unit is associated with its mean vector and covariance matrix. The new competition measure implicitly incorporates the reconstruction error and the distance between the input data and the unit center. In the proposed algorithm, the extra updating step of the principal subspace is not needed.
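
The excerpt breaks off at this point, so the precise form of the new competition measure is not reproduced here. What the abstract does fix is the bookkeeping: each unit stores only a mean vector and a covariance matrix, both updated online, and quantities entering the competition are read directly off these statistics, without any per-sample inversion or eigendecomposition. A minimal sketch of such a unit, with an assumed learning rate alpha that is not from the paper:

```python
import numpy as np

class LocalPCAUnit:
    """A unit of the kind described in the abstract: all local
    information lives in the mean vector and covariance matrix."""
    def __init__(self, dim):
        self.c = np.zeros(dim)               # unit center (mean vector)
        self.C = np.eye(dim)                 # local covariance matrix

    def update(self, x, alpha=0.05):
        """Online update after winning the competition for sample x.
        alpha is an assumed exponential-forgetting rate."""
        d = x - self.c
        self.c += alpha * d                  # move the center toward x
        self.C += alpha * (np.outer(d, d) - self.C)  # track local covariance

    def competition_terms(self, x):
        """Illustrative only: terms such as ||x - c||^2 and
        (x - c)^T C (x - c) can be read directly off the stored
        statistics -- no inversion or eigendecomposition per input,
        which is what the abstract emphasizes."""
        d = x - self.c
        return d @ d, d @ self.C @ d
```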
