PCA-LDA-Case-Studies -- PCA/LDA Study Notes


CS 479/679 Pattern Recognition, Spring 2006
Dimensionality Reduction Using PCA/LDA
Chapter 3 (Duda et al.), Section 3.8

Case Studies: Face Recognition Using Dimensionality Reduction
M. Turk, A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86, 1991.
D. Swets, J. Weng, "Using Discriminant Eigenfeatures for Image Retrieval", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 8, pp. 831-836, 1996.
A. Martinez, A. Kak, "PCA versus LDA", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 2, pp. 228-233, 2001.

Dimensionality Reduction
One approach to dealing with high-dimensional data is to reduce its dimensionality: project the high-dimensional data onto a lower-dimensional sub-space using a linear or non-linear transformation.

Linear transformations are simple to compute and tractable. The classical linear approaches are Principal Component Analysis (PCA) and Fisher Discriminant Analysis (FDA). Such a transformation maps a d x 1 vector x to a k x 1 vector y = W x, where W is a k x d matrix and k << d.

Principal Component Analysis (PCA)
Each dimensionality reduction technique finds an appropriate transformation by satisfying certain criteria (e.g., information loss, data discrimination, etc.). The goal of PCA is to reduce the dimensionality of the data while retaining as much as possible of the variation present in the dataset.

Find a basis for a low-dimensional sub-space and approximate each vector by its projection onto that sub-space:
(1) the original space representation;
(2) the lower-dimensional sub-space representation.
Note: if K = N, no approximation is made and the original vector is recovered exactly, which is what the K = N example illustrates.
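In the standard formulation (the orthonormal basis vectors u_i, coefficients b_i, and sample mean below are assumed notation following the usual PCA presentation, not quoted from the slides), the two representations are:

$$x - \bar{x} = \sum_{i=1}^{N} b_i u_i \qquad \text{(1) original space, exact,}$$

$$\hat{x} - \bar{x} = \sum_{i=1}^{K} b_i u_i, \quad K \ll N \qquad \text{(2) lower-dimensional sub-space,}$$

with coefficients b_i = u_i^T (x - x_bar). If K = N, then x_hat = x and nothing is lost.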

Information loss
Dimensionality reduction implies information loss! PCA preserves as much information as possible. What is the "best" lower-dimensional sub-space? The "best" low-dimensional space is centered at the sample mean, and its directions are determined by the "best" eigenvectors of the covariance matrix of the data x. By "best" eigenvectors we mean those corresponding to the largest eigenvalues (i.e., the "principal components"). Since the covariance matrix is real and symmetric, these eigenvectors are orthogonal and form a set of basis vectors (see pp. 114-117 of the textbook for a proof).
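Concretely, assuming the usual sample-covariance definition (this notation is not quoted from the slides), the eigenvectors and eigenvalues referred to here are those of

$$\Sigma_x = \frac{1}{M} \sum_{i=1}^{M} (x_i - \bar{x})(x_i - \bar{x})^T, \qquad \Sigma_x u_i = \lambda_i u_i, \qquad \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_N \ge 0,$$

and the "best" K-dimensional sub-space is the one spanned by u_1, ..., u_K.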

Methodology
Suppose x1, x2, ..., xM are N x 1 vectors.

Eigenvalue spectrum
[Figure: eigenvalue spectrum, with the eigenvalues λ_i plotted against the index i and K and N marked on the index axis.]

Linear transformation implied by PCA
The dimensionality reduction is performed by a linear transformation from R^N to R^K.
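A minimal sketch of this methodology in NumPy, assuming the data are stored as an M x N matrix of row vectors; the helper names pca_fit, pca_project, and pca_reconstruct are illustrative, not taken from the course material:

```python
import numpy as np

def pca_fit(X, K):
    """Fit PCA on X of shape (M, N): M samples, each an N-dimensional vector.

    Returns the sample mean, the top-K eigenvectors (columns of U_K),
    and all eigenvalues in decreasing order.
    """
    mean = X.mean(axis=0)                   # sample mean (N,)
    Xc = X - mean                           # mean-subtracted data
    cov = Xc.T @ Xc / X.shape[0]            # N x N sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # symmetric matrix -> real eigenvalues, ascending
    order = np.argsort(eigvals)[::-1]       # re-sort in decreasing order
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    return mean, eigvecs[:, :K], eigvals

def pca_project(x, mean, U_K):
    """Linear transformation R^N -> R^K implied by PCA: y = U_K^T (x - mean)."""
    return U_K.T @ (x - mean)

def pca_reconstruct(y, mean, U_K):
    """Approximate reconstruction in the original space: x_hat = mean + U_K y."""
    return mean + U_K @ y
```

With `mean, U_K, eigvals = pca_fit(X, K)`, a vector is reduced via `pca_project` and approximately recovered via `pca_reconstruct`; the covariance eigenvectors kept in U_K are the directions along which the data varies the most.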

Geometric interpretation
PCA projects the data along the directions where the data varies the most. These directions are determined by the eigenvectors of the covariance matrix corresponding to the largest eigenvalues. The magnitude of each eigenvalue corresponds to the variance of the data along the associated eigenvector direction.

How many principal components to keep?
To choose K, you can use a criterion based on the fraction of the total variance retained by the first K components.
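A common form of this criterion (the threshold value is an assumption; 0.9 or 0.95 are typical choices) is to pick the smallest K such that

$$\frac{\sum_{i=1}^{K} \lambda_i}{\sum_{i=1}^{N} \lambda_i} > T, \qquad \text{e.g. } T = 0.9 \text{ or } 0.95.$$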

What is the error due to dimensionality reduction?
It can be shown that the average error due to dimensionality reduction is determined by the eigenvalues of the components that are discarded.
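Under the standard derivation (stated here as the usual result, not quoted from the slide), the mean-square reconstruction error equals the sum of the discarded eigenvalues:

$$e_{\text{MSE}} = E\big[\lVert x - \hat{x} \rVert^2\big] = \sum_{i=K+1}^{N} \lambda_i,$$

so keeping the components with the largest eigenvalues minimizes the error.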

Standardization
The principal components depend on the units used to measure the original variables as well as on the range of values they assume, so the data should always be standardized prior to applying PCA. A common standardization method is to transform all the data to have zero mean and unit standard deviation.
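A minimal sketch of that standardization, with μ_j and σ_j denoting the sample mean and standard deviation of the j-th variable (this notation is mine, not the slides'):

$$x'_j = \frac{x_j - \mu_j}{\sigma_j}, \qquad j = 1, \dots, N.$$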

Case Study: Eigenfaces for Face Detection/Recognition
M. Turk, A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86, 1991.

Face Recognition
The simplest approach is to treat face recognition as a template matching problem. Problems arise, however, when recognition is performed in a high-dimensional space. Significant improvements can be achieved by first mapping the data into a lower-dimensional space. How do we find this lower-dimensional space?

Main idea behind eigenfaces
[Figure: the average face computed from the training images.]

Computation of the eigenfaces
The eigenfaces are the eigenvectors u_i of the covariance matrix of the mean-subtracted training face images.
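A sketch of this computation in NumPy, assuming the standard Turk-Pentland procedure, including the usual trick of diagonalizing the small M x M matrix A^T A instead of the huge N x N covariance matrix A A^T; the name compute_eigenfaces and its variables are illustrative:

```python
import numpy as np

def compute_eigenfaces(faces, K):
    """Compute K eigenfaces from M training images.

    faces : array of shape (M, N), each row a face image flattened to N pixels.
    Returns the average face psi (N,), the eigenfaces as columns of U (N, K),
    and the corresponding eigenvalues.
    """
    M, N = faces.shape
    psi = faces.mean(axis=0)              # average face
    A = (faces - psi).T                   # N x M matrix of mean-subtracted faces
    small = A.T @ A / M                   # M x M surrogate for the covariance (1/M) A A^T
    vals, vecs = np.linalg.eigh(small)    # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:K]    # keep the K largest
    vals, vecs = vals[order], vecs[:, order]
    # If (A^T A) v = lambda v, then (A A^T)(A v) = lambda (A v): map back and normalize.
    U = A @ vecs                          # N x K
    U /= np.linalg.norm(U, axis=0)        # unit-length eigenfaces
    return psi, U, vals
```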

Representing faces onto this basis
Each face is represented by the coefficients of its projection onto the K eigenfaces.

Eigenvalue spectrum
[Figure: eigenvalue spectrum, λ_i versus index i, with K, M, and N marked on the index axis.]

Face Recognition Using Eigenfaces
A new face is projected onto the eigenface basis and compared with the stored projections of the known faces. The distance e_r measures how close the projected input face is to each known face within the eigenface space.
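A sketch of projection and nearest-neighbour recognition under the same assumptions, reusing the output of the compute_eigenfaces sketch above; the names project and recognize, and the threshold parameter, are illustrative:

```python
import numpy as np

def project(face, psi, U):
    """Eigenface coefficients of a face: w = U^T (face - psi)."""
    return U.T @ (face - psi)

def recognize(face, psi, U, gallery_coeffs, labels, threshold=np.inf):
    """Assign the input face to the nearest known face in eigenface space.

    gallery_coeffs : (P, K) array of projected known faces; labels : length-P sequence.
    Returns the best matching label, or None if the smallest distance e_r
    exceeds the threshold.
    """
    w = project(face, psi, U)
    e_r = np.linalg.norm(gallery_coeffs - w, axis=1)  # distances within face space
    best = int(np.argmin(e_r))
    return labels[best] if e_r[best] < threshold else None
```

Each known face is stored once as its coefficient vector, so recognition reduces to a nearest-neighbour search over K-dimensional vectors instead of N-dimensional images.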
