Carreira-Perpiñán, M. Á. (2001): *Continuous latent variable models for dimensionality reduction and sequential data reconstruction*. PhD thesis, Dept. of Computer Science, University of Sheffield, UK.

Continuous latent variable models (cLVMs) are probabilistic models that represent a distribution in a high-dimensional Euclidean space using a small number of continuous latent variables. This thesis explores, theoretically and practically, the suitability of cLVMs for dimensionality reduction and sequential data reconstruction.

The first part of the thesis reviews and extends the theory of cLVMs: definition in terms of a prior distribution in latent space, a mapping to data space and a noise model; maximum likelihood parameter estimation with an expectation-maximisation (EM) algorithm; specific cLVMs (factor analysis, principal component analysis (PCA), independent component analysis, independent factor analysis and the generative topographic mapping (GTM)); mixtures of cLVMs; identifiability, interpretability and visualisation; and derivation of mappings for dimensionality reduction and reconstruction and their properties, such as continuity, for each cLVM. We extend GTM to diagonal noise and give a corresponding EM algorithm.
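To make the cLVM formalism concrete, here is a minimal sketch of the simplest such model, factor analysis, fitted by EM: a Gaussian prior on the latent space, a linear mapping to data space, and a diagonal Gaussian noise model. This is an illustrative Python/NumPy sketch (the software accompanying the thesis is in Matlab), not the thesis's own implementation; the function name and interface are hypothetical.

```python
import numpy as np

def factor_analysis_em(X, q, n_iter=50, seed=0):
    """EM for factor analysis: x = W z + mu + e, with latent prior
    z ~ N(0, I_q) and diagonal noise e ~ N(0, diag(psi))."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    S = Xc.T @ Xc / N                       # sample covariance
    W = rng.standard_normal((d, q)) * 0.1   # loading matrix (mapping to data space)
    psi = np.diag(S).copy()                 # diagonal noise variances
    lls = []
    for _ in range(n_iter):
        # E-step: posterior moments of z given each data point
        G = np.eye(q) + (W.T / psi) @ W     # q x q
        Ginv = np.linalg.inv(G)
        beta = Ginv @ (W.T / psi)           # E[z|x] = beta (x - mu)
        Ez = Xc @ beta.T                    # N x q posterior means
        Ezz = N * Ginv + Ez.T @ Ez          # sum_n E[z_n z_n^T]
        # M-step: re-estimate loadings and noise variances
        W = (Xc.T @ Ez) @ np.linalg.inv(Ezz)
        psi = np.diag(S - W @ (Ez.T @ Xc) / N).copy()
        # log-likelihood of the equivalent Gaussian N(mu, W W^T + diag(psi))
        C = W @ W.T + np.diag(psi)
        _, logdet = np.linalg.slogdet(C)
        ll = -0.5 * N * (d * np.log(2 * np.pi) + logdet
                         + np.trace(np.linalg.solve(C, S)))
        lls.append(ll)
    return W, psi, mu, lls
```

EM monotonically increases the log-likelihood, so `lls` should be non-decreasing; restricting `psi` to a single scalar would give probabilistic PCA instead.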

We also describe a discrete LVM for binary data, Bernoulli mixtures, widely used in practice. We show that their log-likelihood surface has no singularities, unlike other mixture models, which makes EM estimation practical; and that their theoretical non-identifiability is rarely realised in actual estimates, which makes them interpretable.
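The absence of log-likelihood singularities is what makes plain EM work well for Bernoulli mixtures. A minimal Python/NumPy sketch of that EM loop follows (the function name and interface are hypothetical; the thesis's accompanying software is in Matlab):

```python
import numpy as np

def bernoulli_mixture_em(X, M, n_iter=100, seed=0):
    """EM for a mixture of M multivariate Bernoulli distributions
    fitted to binary data X (N x d)."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    pi = np.full(M, 1.0 / M)                # mixing proportions
    P = rng.uniform(0.25, 0.75, (M, d))     # component Bernoulli parameters
    for _ in range(n_iter):
        # E-step: responsibilities from log-domain Bernoulli likelihoods
        logp = X @ np.log(P).T + (1 - X) @ np.log(1 - P).T + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        R = np.exp(logp)
        R /= R.sum(axis=1, keepdims=True)   # N x M responsibilities
        # M-step: re-estimate proportions and parameters
        Nm = R.sum(axis=0)
        pi = Nm / N
        P = (R.T @ X) / Nm[:, None]
        P = np.clip(P, 1e-6, 1 - 1e-6)      # keep parameters off the 0/1 boundary
    return pi, P
```

On well-separated binary clusters the estimated prototype rows of `P` typically land near the true ones, which is the practical face of the interpretability result mentioned above.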

The second part deals with dimensionality reduction. We define the problem and give an extensive, critical review of nonprobabilistic methods for it: linear methods (PCA, projection pursuit), nonlinear autoassociators, kernel methods, local dimensionality reduction, principal curves, vector quantisation methods (elastic net, self-organising map) and multidimensional scaling methods. We then empirically evaluate, in terms of reconstruction error, computation time and visualisation, several latent-variable methods for dimensionality reduction of binary electropalatographic (EPG) data: PCA, factor analysis, mixtures of factor analysers, GTM and Bernoulli mixtures. We compare these methods with earlier, nonadaptive EPG data reduction methods and derive 2D maps of EPG sequences for use in speech research and therapy.
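As a concrete instance of the reconstruction-error criterion used in that evaluation, here is a short Python/NumPy sketch of PCA dimensionality reduction scored by mean squared reconstruction error (an illustrative helper with a hypothetical name, not the thesis's evaluation code):

```python
import numpy as np

def pca_reconstruction_error(X, L):
    """Project X onto its top-L principal components and return the
    mean squared error of reconstructing X from those L coordinates."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # principal directions = right singular vectors of the centred data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:L].T                  # d x L orthonormal basis
    Z = Xc @ V                    # L-dimensional codes
    Xhat = Z @ V.T + mu           # reconstruction in data space
    return np.mean((X - Xhat) ** 2)
```

The error is non-increasing in `L` and reaches (numerical) zero once `L` equals the intrinsic rank of the data; comparing such curves across methods is how the EPG experiments rank PCA against the latent-variable alternatives.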

The last part of this thesis proposes a new method for missing data reconstruction of sequential data that includes as a particular case the inversion of many-to-one mappings. We define the problem, distinguish it from inverse problems, and show when both coincide. The method is based on multiple pointwise reconstruction and constraint optimisation. Multiple pointwise reconstruction uses a Gaussian mixture joint density model for the data, conveniently implemented with a nonlinear cLVM (GTM). The modes of the conditional distribution of missing values given present values at each point in the sequence represent local candidate reconstructions. A global sequence reconstruction is obtained by efficiently optimising a constraint, such as continuity or smoothness, with dynamic programming. We give a probabilistic interpretation of the method. We derive two algorithms for exhaustive mode finding in Gaussian mixtures, based on gradient-quadratic search and fixed-point search, respectively; as well as estimates of error bars for each mode and a measure of distribution sparseness. We discuss the advantages of the method over previous work based on the conditional mean or on universal mapping approximators (including ensembles and recurrent networks), conditional distribution estimation, vector quantisation and statistical analysis of missing data. We study the performance of the method with synthetic data (a toy example and an inverse kinematics problem) and real data (mapping between EPG and acoustic data). We describe the possible application of the method to several well-known reconstruction or inversion problems: decoding of neural population activity for hippocampal place cells; wind field retrieval from scatterometer data; inverse kinematics and dynamics of a redundant manipulator; acoustic-to-articulatory mapping; audiovisual mappings for speech recognition; and recognition of occluded speech.
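The fixed-point mode search mentioned above can be illustrated very compactly for the isotropic, equal-variance case, where the stationary points of a Gaussian mixture satisfy x = sum_m p(m|x) mu_m (a mean-shift-style update). The following one-dimensional Python/NumPy sketch is an assumption-laden toy (hypothetical function name; the thesis develops the general gradient-quadratic and fixed-point algorithms for full multivariate mixtures):

```python
import numpy as np

def gm_modes_fixed_point(means, weights, sigma, n_iter=200, tol=1e-8):
    """Exhaustive mode finding in a 1-D isotropic Gaussian mixture via the
    fixed-point iteration x <- sum_m p(m|x) mu_m, started once from every
    component centroid; distinct converged points are returned as modes."""
    means = np.asarray(means, float)
    weights = np.asarray(weights, float)
    modes = []
    for x in means.copy():
        for _ in range(n_iter):
            # posterior responsibilities p(m | x) under the mixture
            logr = np.log(weights) - (x - means) ** 2 / (2 * sigma ** 2)
            r = np.exp(logr - logr.max())
            r /= r.sum()
            x_new = r @ means          # fixed-point (mean-shift) update
            if abs(x_new - x) < tol:
                break
            x = x_new
        if not any(abs(x - m) < 1e-4 for m in modes):
            modes.append(x)
    return sorted(modes)
```

Well-separated components yield one mode each, while nearby components merge into a single mode; the per-frame modes found this way are the local candidate reconstructions that the dynamic-programming step then links into a continuous sequence.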

- Introduction
- The continuous latent variable modelling formalism
- Some properties of finite mixtures of multivariate Bernoulli distributions
- Dimensionality reduction
- Dimensionality reduction of electropalatographic (EPG) data
- Inverse problems and mapping inversion
- Sequential data reconstruction
- Exhaustive mode finding in Gaussian mixtures
- Experiments with synthetic data
- Experiments with real-world data: the acoustic-to-articulatory mapping problem
- Conclusions
- Appendices

Continuous latent variable models, generative models, generative topographic mapping (GTM), identifiability, Bernoulli mixtures, electropalatography, dimensionality reduction, inverse problems, missing data reconstruction, sequential data reconstruction, mapping inversion, conditional distribution, mode finding, continuity constraints, distribution sparseness, universal mapping approximators, vector quantisation, inverse kinematics, acoustic-to-articulatory mapping.

You can download the full thesis (333 pages, 130 figures, 24 tables, 445 references) in the following formats:

- For one-sided printing, either in PDF (5.5M) or in gzipped PostScript (4.0M).
- For two-sided printing, either in PDF (5.5M) or in gzipped PostScript (4.0M).

All files are in European A4 size but should print fine on US letter paper (if not, you can always use "fit to page" from e.g. Acroread to print the PDF file). Although many pages contain colour figures, they should remain readable when printed in black and white, except perhaps some of those in chapters 9 and 10.

The following chapters, available separately, contain tutorial material:

- Chapter 2:
*The continuous latent variable modelling formalism*.

This contains a review of continuous latent variable models: probabilistic principal component analysis (PCA), factor analysis, the generative topographic mapping (GTM), independent component analysis (ICA), mixtures of latent variable models, etc. It also deals with issues such as parameter estimation, identifiability, interpretability, visualisation, and dimensionality reduction with continuous latent variable models.

[paper PDF] [paper PS]

- Chapter 4:
*Dimensionality reduction*.

This contains a review of dimensionality reduction with nonprobabilistic methods (probabilistic methods, i.e., latent variable models, are reviewed in chapter 2): nonlinear autoassociators, kernel PCA, principal curves, vector quantisation, multidimensional scaling, Isomap, LLE, etc. It also reviews issues such as the curse of dimensionality and the intrinsic dimensionality.

[paper PDF] [paper PS]

- Chapter 6:
*Inverse problems and mapping inversion*.

This contains a review of inverse problem theory and its methods (Bayesian and nonprobabilistic), and its relation with latent variable models.

[paper PDF] [paper PS]

- Chapter 10 (partially):
*The acoustic-to-articulatory mapping problem*.

This contains a review of the acoustic-to-articulatory mapping problem of speech research (the recovery of the vocal tract configuration given the acoustic waveform), emphasising its potential role in improving automatic speech recognition.

[paper PDF] [paper PS]

A BibTeX file containing the bibliography for the thesis (.tar.gz, 445 references, 164K). Most of the entries also contain the abstract, keywords, URL (if the reference is available online), related software and other additional information.

Matlab implementation of several of the models and algorithms discussed:

- Factor analysis
- Principal component analysis
- Mixture of multivariate Bernoulli distributions
- Mode-finding in Gaussian mixtures (including the calculation of error bars, gradient and Hessian)
- Tools for imaging EPG frames

Miguel A. Carreira-Perpinan. Last modified: Fri Jun 24 16:33:13 PDT 2016