I design machine learning algorithms that try to solve some of today's most challenging problems in computer science and statistics.

I adapt ideas from physics and the statistical sciences, and use them in algorithms that can be applied to areas such as bioinformatics, artificial intelligence, pattern recognition, document information retrieval, and human-computer interaction.

Click on the following topics to see research descriptions and some papers:

Nonparametric Bayes - powerful nonparametric text/document modelling
Variational Bayesian Methods - approximate Bayesian learning and inference
Bioinformatics - microarray analysis using variational Bayes
Embedded Hidden Markov Models - a novel tool for time series inference
Probabilistic Sensor Fusion - combining modalities using Bayesian graphical models
Collaborators - people I have worked with


Hierarchical Dirichlet Process theory

Some of my past work investigated a non-trivial extension of the Hidden Markov Model to one with a countably infinite number of hidden states, so that the model selection question "how many hidden states does my HMM need?" becomes moot. This was achieved by inventing a Hierarchical Dirichlet Process (HDP) construction and introducing an oracle to stitch together the infinity of hidden states. Recent research has shown that, whilst the insight and intuition behind the HDP are sound, there also exists an elegant and powerful description of the model as a generative Bayesian nonparametric model. Current research aims to understand this model and to derive efficient MCMC algorithms for inference and for learning its hyperparameters. This is a good example of a nonparametric Bayesian model, the subject of a recent NIPS workshop that I co-organised with Yee Whye Teh.
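To give a rough flavour of the idea, here is a small Python sketch (not the construction from the papers themselves, just a common truncated stick-breaking approximation) of how an HDP ties the hidden states of an infinite-state HMM together: a global weight vector plays the oracle's role of making every state agree on which hidden states are popular. The truncation level K and the concentration parameters gamma and alpha are illustrative choices, not values from my work.

```python
import numpy as np

def stick_breaking(gamma, K, rng):
    """Truncated stick-breaking (GEM) weights for a Dirichlet process.

    Draws b_k ~ Beta(1, gamma) and sets pi_k = b_k * prod_{j<k}(1 - b_j);
    a finite truncation K approximates the countably infinite sequence.
    """
    b = rng.beta(1.0, gamma, size=K)
    return b * np.concatenate(([1.0], np.cumprod(1.0 - b[:-1])))

def hdp_hmm_transitions(gamma, alpha, K, rng):
    """Sample a truncated HDP-HMM transition matrix.

    Global weights beta ~ GEM(gamma) are shared across rows: each
    state's transition distribution is pi_j ~ Dirichlet(alpha * beta),
    so all states place most of their mass on the same (notionally
    infinite) set of popular successor states.
    """
    beta = stick_breaking(gamma, K, rng)
    return np.vstack([rng.dirichlet(alpha * beta) for _ in range(K)])

rng = np.random.default_rng(0)
P = hdp_hmm_transitions(gamma=1.0, alpha=5.0, K=20, rng=rng)
# Each row of P is a valid transition distribution over the truncated
# state space; rows are coupled through the shared global weights.
```

Increasing gamma spreads the global weights over more states, while alpha controls how closely each row follows them; MCMC inference in the full model samples these quantities rather than fixing a truncation.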

Current applications of HDPs include language modelling and document understanding, and in the future we may also apply them to bioinformatics and linguistics problems.