At the end of this course, the student
- is able to formulate a probabilistic model for regression, classification, and density estimation
- understands the principle of maximum likelihood estimation as well as the full Bayesian approach
- is able to derive algorithms using these principles for a wide class of models, such as multivariate Gaussian models, linear models for regression, logistic regression, and Gaussian mixtures (the EM algorithm), as well as kernel, variational, and sampling methods
- is able to understand and implement mathematically described methods from modern statistical machine learning
Machine learning is concerned with methods for decision making based on data. In statistical machine learning, these methods are based on probabilistic models and statistical inference, including maximum likelihood estimation and Bayesian learning. These methods have a wide variety of applications, such as visual object recognition and the analysis of genetic, financial, and neuroscience data.
In this course we provide a principled treatment of the basic models and methods from statistical machine learning. This requires a certain mathematical depth, but we will take ample time to acquire the necessary mathematical knowledge and skills through exercises, (computer) assignments, and, optionally, student projects on more advanced state-of-the-art machine learning methods (such as Gaussian processes, kernel methods, Bayesian model selection, the EM algorithm, variational methods, and Markov chain Monte Carlo).
Instructional Modes
- Lecture
- Tutorial
- Self-study
Prerequisites
"Wiskunde 1 en 2 voor Kunstmatige Intelligentie" or equivalent (basic calculus and linear algebra)
Examination
- Written exam (open book)
- Group assignments
This course is intended for Master's students in Computer Science and Artificial Intelligence. Students in Physics and Mathematics who are interested in this topic are advised to follow the course Machine Learning in the Master's phase.