Miguel Á. Carreira-Perpiñán
Associate professor
Electrical Engineering and Computer Science
School of Engineering
University of California, Merced
mcarreira-perpinan-[at]-ucmerced.edu; 209-228-4545
Office: 284, Science & Engineering Building

Office hours: by appointment (call or email, including [EECS282] in the subject).

Lectures: Tuesdays/Thursdays 10:30-11:45am (Classroom Building 274)

Lab class: Mondays 10am-12:50pm (Linux Lab, SE138)

Course web page: `http://faculty.ucmerced.edu/mcarreira-perpinan/teaching/EECS282`

The course reviews advanced topics in machine learning. Machine learning is the study of models and algorithms that learn information from data. Machine learning ideas underlie many algorithms in computer vision, speech processing, bioinformatics, robotics, computer graphics and other areas. The 2010 edition of the course will focus on **dimensionality reduction and manifold learning** and extend the contents of the 2008 edition.

Prerequisites: the course is intended for graduate students who have taken an introductory course in machine learning (such as EECS276).

There is no required textbook. Selected readings will appear on this web page in due course. The following are two reviews of dimensionality reduction and manifold learning techniques:

- M. Á. Carreira-Perpiñán (2001): *Continuous latent variable models for dimensionality reduction and sequential data reconstruction*. PhD thesis, University of Sheffield, UK.
  - Chapter 2: *The continuous latent variable modelling formalism*.

    A review of continuous latent variable models: probabilistic principal component analysis (PCA), factor analysis, the generative topographic mapping (GTM), independent component analysis (ICA), mixtures of latent variable models, etc. It also covers issues such as parameter estimation, identifiability, interpretability, visualisation, and dimensionality reduction with continuous latent variable models.
  - Chapter 4: *Dimensionality reduction*.

    A review of dimensionality reduction with nonprobabilistic methods (probabilistic methods, i.e. latent variable models, are reviewed in chapter 2): nonlinear autoassociators, kernel PCA, principal curves, vector quantisation, multidimensional scaling, Isomap, LLE, etc. It also covers the curse of dimensionality and the intrinsic dimensionality.

- L. K. Saul, K. Q. Weinberger, J. H. Ham, F. Sha and D. D. Lee (2006): "Spectral methods for dimensionality reduction". In *Semi-Supervised Learning* (O. Chapelle, B. Schölkopf and A. Zien, eds.), MIT Press, pp. 293-308.

Other books on general machine learning:

- Christopher M. Bishop: *Pattern Recognition and Machine Learning*. Springer, 2006.
- David J. C. MacKay: *Information Theory, Inference and Learning Algorithms*. Cambridge University Press, 2003.
- Bernhard Schölkopf and Alexander J. Smola: *Learning with Kernels*. MIT Press, 2001.
- Trevor J. Hastie, Robert J. Tibshirani and Jerome H. Friedman: *The Elements of Statistical Learning*. Springer, 2001.
- Richard O. Duda, Peter E. Hart and David G. Stork: *Pattern Classification*, second ed. Wiley, 2001.
- Aapo Hyvärinen, Juha Karhunen and Erkki Oja: *Independent Component Analysis*. Wiley, 2001.
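Many of the readings below take plain PCA as their point of reference. As an illustrative baseline only (Python/NumPy used here for convenience; it is not part of the course materials), PCA can be sketched via an eigendecomposition of the sample covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 points near a 2D plane embedded in 5D, plus a little noise
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) + 0.01 * rng.normal(size=(200, 5))

def pca(X, L):
    """Project X (N x D) onto its L leading principal components."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # Eigendecomposition of the D x D sample covariance matrix
    evals, evecs = np.linalg.eigh(Xc.T @ Xc / len(X))
    order = np.argsort(evals)[::-1]     # sort eigenvalues in descending order
    W = evecs[:, order[:L]]             # D x L orthonormal basis
    return Xc @ W, W, mu

Z, W, mu = pca(X, 2)
print(Z.shape)  # (200, 2)
```

Since the data lie near a 2D plane, the rank-2 reconstruction `mu + Z @ W.T` recovers `X` up to the noise level.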

- Nonlinear methods based on pairwise distances (Oct. 7):
- Hinton and Roweis: "Stochastic Neighbor Embedding". NIPS, 2002. Code.
- van der Maaten and Hinton: "*t*-distributed Stochastic Neighbor Embedding". JMLR, 2008. Code.
- Carreira-Perpiñán: "The Elastic Embedding algorithm for dimensionality reduction". ICML, 2010. Code.
- Globerson et al: "Euclidean embedding of co-occurrence data". JMLR, 2007. Code.
- Iwata et al: "Parametric embedding for class visualization". Neural Computation, 2007.
- Venna et al: "Information retrieval perspective to nonlinear dimensionality reduction for data visualization". JMLR, 2010.
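These methods all operate on pairwise affinities between points. As a rough illustration (not any one paper's algorithm; a fixed bandwidth is assumed in place of t-SNE's per-point perplexity calibration), here is the t-SNE-style objective: Gaussian affinities in the input space, Student-t affinities in the embedding, compared by KL divergence:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))   # high-dimensional points
Y = rng.normal(size=(50, 2))    # candidate low-dimensional embedding

def sq_dists(A):
    """Matrix of squared Euclidean distances between rows of A."""
    s = (A * A).sum(axis=1)
    return s[:, None] + s[None, :] - 2 * A @ A.T

def tsne_objective(X, Y, sigma=1.0):
    """KL(P || Q): Gaussian input affinities vs Student-t output affinities."""
    P = np.exp(-sq_dists(X) / (2 * sigma**2))
    np.fill_diagonal(P, 0.0)
    P /= P.sum()                        # joint distribution over pairs
    Q = 1.0 / (1.0 + sq_dists(Y))       # Student-t kernel (1 degree of freedom)
    np.fill_diagonal(Q, 0.0)
    Q /= Q.sum()
    mask = P > 0
    return np.sum(P[mask] * np.log(P[mask] / Q[mask]))

kl = tsne_objective(X, Y)
```

An embedding algorithm would minimize this objective over `Y` by gradient descent; the heavy-tailed `Q` is what distinguishes t-SNE from the original SNE.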

- Manifold denoising (Oct. 19):
- Taubin: "A signal processing approach to fair surface design". SIGGRAPH, 1995.

  Review: Taubin: "Geometric signal processing on polygonal meshes". Eurographics'2000: State of the Art Reports. Code.
- Desbrun et al: "Implicit fairing of irregular meshes using diffusion and curvature flow". SIGGRAPH, 1999.
- Hein and Maier: "Manifold denoising". NIPS, 2007. Code.
- Wang and Carreira-Perpiñán: "Manifold Blurring Mean Shift algorithms for manifold denoising". CVPR, 2010. Code.
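A basic operation underlying several of these algorithms is a Gaussian blurring mean-shift step: every point moves to the Gaussian-weighted mean of the dataset, which averages out noise orthogonal to the manifold. The sketch below is illustrative only (single step, fixed bandwidth chosen by hand); the Wang and Carreira-Perpiñán paper additionally restricts the motion using a local PCA fit, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(0)
# Noisy samples from a circle (a 1D manifold) in 2D
t = rng.uniform(0, 2 * np.pi, size=300)
X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.normal(size=(300, 2))

def blurring_mean_shift_step(X, sigma):
    """Move each point to the Gaussian-weighted mean of all points."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    W /= W.sum(axis=1, keepdims=True)   # row-normalize the weights
    return W @ X

Xd = blurring_mean_shift_step(X, sigma=0.15)
```

After one step the points lie noticeably closer to the unit circle; iterating too long would shrink the circle (the curvature bias discussed in the readings).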

- Missing data (Oct. 21):
- Scholz et al: "Non-linear PCA: A missing data approach". Bioinformatics, 2005. Code.
- Carreira-Perpiñán and Lu: "Manifold learning and missing data recovery through unsupervised regression". ICDM, 2011. Code.
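A generic scheme in the spirit of these papers (not either paper's actual algorithm) alternates between fitting a low-dimensional model and re-imputing the missing entries from it. A minimal sketch with PCA as the model, under the assumption that the data are approximately low-rank:

```python
import numpy as np

rng = np.random.default_rng(1)
# Rank-2 data in 8 dimensions, with ~20% of the entries missing
X_true = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 8))
M = rng.uniform(size=X_true.shape) < 0.2        # True = missing
X = np.where(M, np.nan, X_true)

def pca_impute(X, M, L=2, iters=50):
    """Alternate between a rank-L PCA fit and re-imputing missing entries."""
    Xf = np.where(M, np.nanmean(X, axis=0), X)  # start from column means
    for _ in range(iters):
        mu = Xf.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xf - mu, full_matrices=False)
        recon = mu + (U[:, :L] * s[:L]) @ Vt[:L]    # rank-L reconstruction
        Xf = np.where(M, recon, X)                  # overwrite only missing entries
    return Xf

X_hat = pca_impute(X, M)
```

Observed entries are never changed; only the missing ones are filled in from the current low-rank fit.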

- Chernoff: "The use of faces to represent points in *k*-dimensional space graphically". JASA, 1973. Code.

- Manifold learning in speech processing (Nov. 1):
- Lu and Dang: "Vowel production manifold: intrinsic factor analysis of vowel articulation". IEEE Trans. ASL, 2010.
- Jafari and Almasganj: "Using Laplacian Eigenmaps Latent Variable Model and manifold learning to improve speech recognition accuracy". Speech Communication, 2010.
- Kumar and Andreou: "Heteroscedastic discriminant analysis and reduced rank HMMs for improved speech recognition". Speech Communication, 1998.
- Jansen and Niyogi: "Intrinsic Fourier analysis on the manifold of speech sounds". ICASSP 2006.

  Longer version: Jansen and Niyogi: "A geometric perspective on speech sounds". Tech. Rep., 2005.
- Ananthakrishnan et al: "Predicting unseen articulations from multi-speaker articulatory models". Interspeech, 2010.

- Canonical correlation analysis (CCA), homogeneity analysis (Nov. 4):
- Hardoon et al: "Canonical correlation analysis: An overview with application to learning methods". Neural Computation, 2004. Code for CCA.
- Michailidis and de Leeuw: "The Gifi system of descriptive multivariate analysis". Stat. Sci., 1998.
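One standard way to compute CCA (a textbook construction, not taken from either paper): whiten each view with a Cholesky factor of its covariance, then take the SVD of the whitened cross-covariance; the singular values are the canonical correlations. An illustrative sketch on two views sharing one latent signal:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two views of a shared 1D latent signal, plus independent noise
z = rng.normal(size=(500, 1))
X = z @ rng.normal(size=(1, 4)) + 0.1 * rng.normal(size=(500, 4))
Y = z @ rng.normal(size=(1, 3)) + 0.1 * rng.normal(size=(500, 3))

def cca(X, Y):
    """Canonical correlations via SVD of the whitened cross-covariance."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    n = len(X)
    Cxx, Cyy, Cxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))   # whitening transforms
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    _, corrs, _ = np.linalg.svd(Wx @ Cxy @ Wy.T)
    return corrs

corrs = cca(X, Y)
```

With one shared latent dimension, the leading canonical correlation is close to 1 and the rest are near zero.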

- Other methods (Nov. 17):
- Carreira-Perpiñán and Lu: "Parametric Dimensionality Reduction by Unsupervised Regression". CVPR, 2010. Code.
- Mordohai and Medioni: "Dimensionality estimation, manifold learning and function approximation using tensor voting". JMLR, 2010. Code.
- Yu et al: "Nonlinear learning using local coordinate coding". NIPS, 2010.

  Yu and Zhang: "Improved local coordinate coding using local tangents". ICML, 2010.
- Freund et al: "Learning the structure of manifolds using random projections". NIPS, 2008. Code.

  Longer version: Dasgupta and Freund: "Random projection trees for vector quantization". IEEE Trans. IT, 2009.
- Ailon and Chazelle: "Faster dimension reduction". CACM, 2010. Perspective.
- Li: "Sliced inverse regression for dimension reduction". JASA, 1991. Code.

  Kim and Pavlovic: "Dimensionality reduction using covariance operator inverse regression". CVPR, 2008.

- Sparse PCA (Nov. 24):
- Kaiser: "The varimax criterion for analytic rotation in factor analysis". Psychometrika, 1958. Code.
- Zou et al: "Sparse principal component analysis". J. Comp. Graph. Stat., 2006.
- Thiao et al: "A DC programming approach for sparse eigenvalue problem". ICML, 2010.
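As a simple illustration of the problem these papers address (this is a truncated power iteration, one of several generic approaches; it is not the algorithm of any paper above): find a leading eigenvector constrained to have at most *k* nonzero entries, by hard-thresholding after each power step:

```python
import numpy as np

# Covariance matrix whose leading eigenvector is sparse (first 3 coordinates)
v = np.zeros(10)
v[:3] = 1 / np.sqrt(3)
C = 5.0 * np.outer(v, v) + np.eye(10)

def truncated_power(C, k, iters=100):
    """Power iteration, keeping only the k largest-magnitude entries per step."""
    x = np.ones(len(C)) / np.sqrt(len(C))
    for _ in range(iters):
        y = C @ x
        keep = np.argsort(np.abs(y))[-k:]   # indices of the k largest entries
        x = np.zeros_like(y)
        x[keep] = y[keep]
        x /= np.linalg.norm(x)              # renormalize to the unit sphere
    return x

x = truncated_power(C, k=3)
```

On this toy covariance the iteration recovers both the support and the direction of the sparse leading eigenvector; ordinary PCA would return a dense vector.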

- Estimation of the intrinsic dimensionality (Nov. 30):
- Strogatz: "Nonlinear dynamics and chaos" (ch. 11: "Fractals"). Westview Press, 1994.
- Fukunaga and Olsen: "An algorithm for finding intrinsic dimensionality of data". IEEE Trans. Computers, 1971.
- Pettis et al: "Intrinsic dimensionality estimator from near-neighbor information". IEEE Trans. PAMI, 1979.
- Grassberger and Procaccia: "Characterization of strange attractors". Phys. Rev. Lett., 1983. Code and Lorenz dataset.

  Longer version: Grassberger and Procaccia: "Measuring the strangeness of strange attractors". Physica D, 1983.
- Kégl: "Intrinsic dimension estimation using packing numbers". NIPS, 2003. Code.
- Costa and Hero: "Geodesic entropic graphs for dimension and entropy estimation in manifold learning". IEEE Trans. Signal Proc., 2004. Code.
- Levina and Bickel: "Maximum likelihood estimation of intrinsic dimension". NIPS, 2005. Code.
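The Levina-Bickel estimator above is particularly simple to state: for each point, average the log-ratios of its *k*th nearest-neighbor distance to the closer ones, and invert. An illustrative sketch (the choice k=10 and the brute-force distance computation are for demonstration only; the estimator has a known upward bias at small k):

```python
import numpy as np

rng = np.random.default_rng(0)
# 500 points from a 2D manifold (a linearly embedded square) in 10D
Z = rng.uniform(size=(500, 2))
X = Z @ rng.normal(size=(2, 10))

def mle_dimension(X, k=10):
    """Levina-Bickel maximum likelihood estimate of intrinsic dimension."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                # exclude self-distances
    T = np.sqrt(np.sort(d2, axis=1)[:, :k])     # distances to the k nearest neighbors
    # Per-point estimate: inverse mean of log(T_k / T_j), j = 1..k-1
    m = 1.0 / np.log(T[:, -1:] / T[:, :-1]).mean(axis=1)
    return m.mean()

print(mle_dimension(X))  # roughly 2 for this 2D manifold
```

Averaging the per-point estimates over the dataset, as done here, follows the original paper.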

- Matrix identities (handy formulas for matrix derivatives, inverses, etc.):
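For example, the Sherman-Morrison-Woodbury inversion identity, (A + UCV)⁻¹ = A⁻¹ − A⁻¹U(C⁻¹ + VA⁻¹U)⁻¹VA⁻¹, is easy to check numerically (illustrative Python; random well-conditioned matrices assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5)) + 5 * np.eye(5)   # well-conditioned 5x5
U = rng.normal(size=(5, 2))
C = rng.normal(size=(2, 2)) + 2 * np.eye(2)
V = rng.normal(size=(2, 5))

# Left-hand side: direct inverse of the rank-2 update
lhs = np.linalg.inv(A + U @ C @ V)
# Right-hand side: Woodbury formula, only inverting small (2x2) matrices beyond A
Ai = np.linalg.inv(A)
rhs = Ai - Ai @ U @ np.linalg.inv(np.linalg.inv(C) + V @ Ai @ U) @ V @ Ai
print(np.allclose(lhs, rhs))  # True
```

The point of the identity is that when A⁻¹ is already known, updating the inverse after a low-rank change costs only a small matrix inversion.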

If you have never used Matlab, many online tutorials are available.

Miguel A. Carreira-Perpinan Last modified: Sat Oct 1 21:23:21 PDT 2011