1. Run the script eigenfaces.m.
   a) By default, the script transforms the originally 1024-dimensional (32x32) points into a 100-dimensional space. What happens if you change the dimensionality of the transformed points?
   b) Add uniform noise drawn from the interval [-28, 28] to the data and perform PCA again.
2. Load the data from outliers.mat.
   a) Perform PCA dimensionality reduction on the data.
   b) Perform singular value decomposition (SVD) using the built-in function of Octave.
   c) Now perform SVD without relying on the built-in function of Octave. What sizes will the matrices U, Sigma, and V have? What would be the result of the multiplication U*S*V^T?
   d) Omit (i.e. set to 0) the least significant latent topic (the one with the smallest singular value) from the reconstruction. What do you observe, and what explanation can you give for it?
   e) What would the result be if we instead discarded (i.e. set to 0) the top 2 singular values? What is the relation between sum(sum(X.^2)) and norm(U*S*V'-X, 'fro')?
   f) Come up with a formula that calculates the Frobenius norm of a matrix X in a vectorized manner.
3. Run the script lda.m. Analyze the source code and modify the optimization objective (i.e. the amount of between-class and/or within-class scatter).
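For exercise 2a, PCA can be sketched as an eigendecomposition of the covariance matrix of the centered data. The data below is random and only stands in for the contents of outliers.mat:

```octave
% Minimal PCA sketch (illustrative random data; in the exercise, X would be
% loaded from outliers.mat)
X = randn(200, 10);
Xc = X - mean(X);                       % center each column
[V, D] = eig(cov(Xc));                  % eigenvectors of the covariance matrix
[~, idx] = sort(diag(D), 'descend');    % order components by explained variance
W = V(:, idx(1:2));                     % keep the top-2 principal directions
Z = Xc * W;                             % the dimensionality-reduced points
```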
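For exercise 2c, one way to obtain an SVD without calling svd() is through the eigendecomposition of X'*X. The sketch below assumes a tall matrix with full column rank and uses illustrative data:

```octave
% Economy-size SVD without the built-in svd() (sketch; assumes m >= n and
% full column rank)
X = randn(6, 3);
[V, D] = eig(X' * X);                   % eigenpairs of the Gram matrix
[lam, idx] = sort(diag(D), 'descend');  % order by decreasing eigenvalue
V = V(:, idx);
s = sqrt(max(lam, 0));                  % singular values
S = diag(s);                            % n x n in the economy form
U = X * V ./ s';                        % left singular vectors, m x n
% In the full SVD of an m x n matrix, U is m x m, Sigma is m x n, V is n x n,
% and the product U*S*V' reconstructs X exactly:
disp(norm(U*S*V' - X, 'fro'))           % ~0 up to round-off
```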
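For exercises 2d-2f, zeroing out singular values and measuring the reconstruction error can be sketched as follows (random illustrative data; by the Eckart-Young theorem, the Frobenius error equals the l2 norm of the dropped singular values):

```octave
X = randn(8, 5);
[U, S, V] = svd(X);
s = diag(S);
n = numel(s);
Sk = S; Sk(n, n) = 0;                   % drop the least significant latent topic
err_low = norm(U*Sk*V' - X, 'fro');     % equals s(n): barely changes the data
St = S; St(1, 1) = 0; St(2, 2) = 0;     % drop the top-2 singular values instead
err_top = norm(U*St*V' - X, 'fro');     % equals sqrt(s(1)^2 + s(2)^2)
% Vectorized Frobenius norm (2f):
frob = @(A) sqrt(sum(sum(A.^2)));       % equivalently sqrt(A(:)' * A(:))
% Since sum(sum(X.^2)) == sum(s.^2), discarding the top singular values
% removes most of the matrix's total "energy".
```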
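For exercise 3, the quantities to modify in lda.m are the scatter matrices. A minimal two-class sketch of the objective (variable names here are illustrative and not necessarily those used in lda.m):

```octave
X1 = randn(50, 2) + 2;                  % class 1 samples (illustrative)
X2 = randn(50, 2) - 2;                  % class 2 samples
m1 = mean(X1); m2 = mean(X2);
m  = mean([X1; X2]);                    % overall mean
Sw = (X1 - m1)'*(X1 - m1) + (X2 - m2)'*(X2 - m2);   % within-class scatter
Sb = 50*(m1 - m)'*(m1 - m) + 50*(m2 - m)'*(m2 - m); % between-class scatter
[W, D] = eig(Sw \ Sb);                  % directions maximizing between/within
```

Changing the objective amounts to changing how Sb and Sw enter this ratio.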