Course details

Bayesian Models for Machine Learning (in English)

BAYa, academic year 2024/2025, winter semester, 5 credits

Probability theory and probability distributions, Bayesian Inference, Inference in Bayesian models with conjugate priors, Inference in Bayesian Networks, Expectation-Maximization algorithm, Approximate inference in Bayesian models using Gibbs sampling, Variational Bayes inference, Stochastic VB, Infinite mixture models, Dirichlet Process, Chinese Restaurant Process, Pitman-Yor Process for Language modeling, Practical applications of Bayesian inference

Language of instruction

English

Completion

Examination

Time span

  • 26 hrs lectures
  • 13 hrs seminar
  • 13 hrs projects

Assessment points

  • 51 pts final exam (written part)
  • 24 pts mid-term test (written part)
  • 25 pts projects

Learning objectives

To demonstrate the limitations of Deep Neural Nets (DNNs), which have become a very popular machine learning tool successful in many areas, but which excel only when a sufficient amount of well-annotated training data is available. To present Bayesian models (BMs), which allow robust decisions even with scarce training data because they take into account the uncertainty in the model parameter estimates. To introduce the concept of latent variables, which makes BMs modular (i.e. more complex models can be built out of simpler ones) and well suited to cases with missing data (e.g. unsupervised learning, where annotations are missing). To build basic skills and intuitions about BMs and to develop more advanced topics such as approximate inference methods, which are necessary for more complex models, and infinite mixture models based on non-parametric BMs. The course is taught in English.
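
As an informal illustration of the scarce-data point above (not part of the official course material; the coin-flip data, prior choice, and variable names are assumptions made for this sketch), the following Python snippet contrasts a maximum-likelihood estimate with the Bayesian posterior predictive probability under a conjugate Beta prior:

```python
import numpy as np

# Illustrative assumption: 3 coin flips, all heads -- scarce training data.
data = np.array([1, 1, 1])
heads, n = data.sum(), len(data)

# Maximum-likelihood estimate: overconfident, P(heads) = 1.0.
p_ml = heads / n

# Bayesian treatment with a conjugate Beta(1, 1) (uniform) prior:
# the posterior is Beta(1 + heads, 1 + tails), and the posterior
# predictive probability of heads is its mean (the Laplace rule).
alpha, beta = 1.0, 1.0
p_bayes = (alpha + heads) / (alpha + beta + n)

print(f"ML estimate:          {p_ml:.2f}")    # 1.00
print(f"Posterior predictive: {p_bayes:.2f}")  # 0.80, hedged by the prior
```

With only three observations the ML estimate commits fully to P(heads) = 1, while the posterior predictive hedges towards the prior, which is exactly the robustness argued for above.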

Syllabus of lectures

  1. Probability theory and probability distributions 
  2. Bayesian Inference (priors, uncertainty of the parameter estimates, posterior predictive probability) 
  3. Inference in Bayesian models with conjugate priors 
  4. Inference in Bayesian Networks (loopy belief propagation) 
  5. Expectation-Maximization algorithm (with application to the Gaussian Mixture Model; a minimal sketch follows this list) 
  6. Approximate inference in Bayesian models using Gibbs sampling 
  7. Variational Bayes inference 
  8. Infinite mixture models, Dirichlet Process, Chinese Restaurant Process 
  9. Pitman-Yor Process for Language modeling 
  10. Practical applications of Bayesian inference
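
As a concrete taste of item 5, the sketch below runs the Expectation-Maximization algorithm on a one-dimensional, two-component Gaussian Mixture Model. It is not the course's demonstration code; the synthetic data, initialization, and iteration count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: synthetic 1-D data drawn from two Gaussians.
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

# Initial guesses for the weights, means, and variances of K = 2 components.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibilities r[n, k] = P(component k | x_n).
    logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * var)
            - 0.5 * (x[:, None] - mu) ** 2 / var)
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate the parameters from the soft assignments.
    Nk = r.sum(axis=0)
    pi = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print("weights:", pi.round(2), "means:", mu.round(2), "vars:", var.round(2))
```

Subtracting the row-wise maximum of the log-probabilities before exponentiating is the usual numerical stabilization; the responsibilities are unchanged because the constant cancels in the normalization.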

Syllabus of seminars

Each lecture will be immediately followed by a demonstration exercise presenting worked examples in Python. The code and data of all demonstrations will be made available to the students and will form the basis of the project.

Syllabus - others, projects and individual work of students

The project follows on from the demonstration exercises and has the students work on provided (simulated or real) data. The students will work in teams in "evaluation" mode and present their results at the final lecture/exercise.

Progress assessment

  • Mid-term exam (24 points)
  • Submission and presentation of project (25 points)
  • Final exam (51 points)

To receive points from the exam, you must score at least 20 points; otherwise the exam is graded 0 points.


Schedule

Day  Type     Weeks     Room  Start  End    Capacity  Lect.grp                       Groups  Info
Fri  lecture  lectures  G202  15:00  16:50  80        1EIT 1MIT 2EIT 2MIT INTE NMAL  xx      Diez
Fri  seminar  lectures  G202  17:00  17:50  80        1EIT 1MIT 2EIT 2MIT INTE NMAL  xx
