Probabilistic Machine Learning
The probabilistic machine learning framework describes how to represent and manipulate uncertainty about models and predictions, and plays a central role in scientific data analysis, machine learning, robotics, cognitive science, and artificial intelligence. This course provides an introduction to the general framework of probabilistic modeling and inference. To this end, it covers generative models for unsupervised learning, such as clustering and topic modeling, as well as standard Bayesian inference methods, including both Markov chain Monte Carlo (MCMC) and variational inference.
Prerequisites. Ideally, participating students will have successfully completed an introductory course on machine learning and statistical learning. However, the course is self-contained, and any student with a solid background in linear algebra should in principle be able to follow.
Lectures: Wednesdays at 16:15 (on Zoom) -- starting on November 4
Tutorials: Mondays at 14:15 (on Zoom) -- (tentatively) starting on November 16
0. Introduction to probabilistic machine learning (Nov. 4)
Reference: Ghahramani, Zoubin. "Probabilistic machine learning and artificial intelligence." Nature 521.7553 (2015): 452-459.
1. Introduction to probabilistic machine learning (Nov. 11)
Reference: Chapter 2 up to Section 2.3.6 and Section 8.2 of Bishop, Pattern Recognition and Machine Learning
BLOCK I: Monte Carlo Methods
2. Gaussian Mixture Model (GMM) + Expectation Maximization (Nov. 18)
Reference: Section 9.2 of Bishop
3. Bayesian GMM + Gibbs Sampling (Nov. 25)
4. Dirichlet Processes (DPs) and infinite Mixture Models (iMMs) (Dec. 2)
5. Hidden Markov Models (HMMs) (Dec. 9)
6. Temporal Point Processes (TPPs) (Dec. 16)
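To give a flavor of the sampling methods covered in this block, here is a minimal Gibbs sampler for a standard bivariate Gaussian with correlation rho. This is an illustrative sketch, not course material: the target distribution, parameter values, and function names are all my own choices, picked because the full conditionals are available in closed form.

```python
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate Gaussian with correlation rho.

    Alternates draws from the exact full conditionals
    x | y ~ N(rho * y, 1 - rho^2) and y | x ~ N(rho * x, 1 - rho^2),
    discarding an initial burn-in phase.
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for t in range(burn_in + n_samples):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if t >= burn_in:
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
mean_x = sum(a for a, _ in samples) / len(samples)
exy = sum(a * b for a, b in samples) / len(samples)
# The sample moments approximate E[x] = 0 and E[xy] = rho = 0.8.
print(mean_x, exy)
```

The same alternating-conditionals idea underlies the Gibbs sampler for the Bayesian GMM in lecture 3, where the conditionals are over cluster assignments and component parameters instead.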
BLOCK II: Variational Inference
7. GMMs + Variational Inference (VI) (Jan. 13)
8. VI + LDA (Jan. 20)
9. Advanced VI (part I): Stochastic VI and Black Box VI (Jan. 27)
10. Advanced VI (part II): Amortized Inference and Variational Autoencoders (Feb. 3)
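As a taste of the variational inference block, the following toy sketch maximizes the evidence lower bound (ELBO) for a Gaussian approximation q = N(m, s^2) of a standard normal target p = N(0, 1). It is not course material: the target, the closed-form ELBO, the step size, and all names are my own illustrative assumptions, chosen so that the optimum is known exactly (m = 0, s = 1, where KL(q || p) = 0).

```python
import math

def elbo(m, s):
    """Closed-form ELBO for q = N(m, s^2) against the target p = N(0, 1).

    ELBO = E_q[log p(x)] + H(q) = 0.5 + log(s) - (m^2 + s^2) / 2,
    which equals -KL(q || p) here since p is normalized.
    """
    return 0.5 + math.log(s) - (m ** 2 + s ** 2) / 2

def fit_vi(m=2.0, s=0.2, lr=0.05, steps=500):
    """Gradient ascent on the ELBO; converges to m = 0, s = 1."""
    for _ in range(steps):
        m += lr * (-m)            # dELBO/dm = -m
        s += lr * (1 / s - s)     # dELBO/ds = 1/s - s
    return m, s

m, s = fit_vi()
print(m, s, elbo(m, s))  # approaches the optimum m = 0, s = 1, ELBO = 0
```

In lectures 7-10 the same objective is optimized in settings where no closed form exists, using coordinate ascent, stochastic gradients, and amortized (neural) inference networks.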