
Bayesian Methods for Data Analysis

Academic year: 2023/2024
Language of instruction: English
ECTS credits: 3
Delivered at: Joint Department with Sberbank ‘Financial Technologies and Data Analysis’
Course type: Compulsory course
When: 2nd year, 1st module

Course Syllabus

Abstract

This course introduces the basic theoretical and applied principles of Bayesian statistical analysis in a manner geared toward students in the social sciences. The Bayesian paradigm is particularly useful for the type of data that social scientists encounter given its recognition of the mobility of population parameters, its ability to incorporate information from prior research, and its ability to update estimates as new data are observed. The course consists of three main sections: a Bayesian approach to probability theory, sampling methods, and major types of generative models.
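
As a small illustration of the Bayesian updating described above, the sketch below works through a hypothetical Beta-Binomial example (the prior, data, and numbers are assumed for illustration, not taken from the course): a Beta prior over a success probability is updated in closed form after observing binomial data.

```python
# Hypothetical example: Bayesian updating with the conjugate Beta-Binomial pair.
from scipy import stats

prior_a, prior_b = 2.0, 2.0   # assumed Beta(2, 2) prior over the success probability
successes, trials = 7, 10     # assumed observed data: 7 successes in 10 trials

# Conjugacy: Beta prior + Binomial likelihood gives a Beta posterior.
post_a = prior_a + successes
post_b = prior_b + (trials - successes)

posterior = stats.beta(post_a, post_b)
print(f"posterior mean: {posterior.mean():.3f}")             # prior mean 0.5 pulled towards 7/10
print(f"95% credible interval: {posterior.interval(0.95)}")
```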
Learning Objectives

  • Mastering the Bayesian approach to probability theory and the main ways to apply it to machine learning problems
  • Acquisition of skills in building probabilistic models, deriving the necessary formulae for solving learning and inference problems within the framework of the built probabilistic models, as well as effective implementation of these models on the computer
Expected Learning Outcomes

  • Know the main Bayesian models used to solve various machine learning problems (mixture distributions, the relevance vector machine, etc.)
  • Know the basic methods of generating a sample from a non-normalised probability distribution (a minimal sampling sketch follows this list)
  • Know the basic methods of learning and inference in probabilistic models (exact and approximate)
  • Know how to choose the appropriate learning method for the given models
  • Know how to derive the necessary formulas for solving learning and inference problems within the framework of constructed probabilistic models
  • Be able to build probabilistic models that take into account the structure of the applied machine learning problem
  • Be able to efficiently implement these models on a computer
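
The sketch below (a hypothetical example; the target density and all settings are assumed, not taken from the course materials) illustrates the sampling outcome above: a random-walk Metropolis sampler draws from an unnormalised density, relying only on density ratios, so the normalising constant is never required.

```python
# Hypothetical example: random-walk Metropolis sampling from an unnormalised density.
import numpy as np

def log_p(x):
    # Log of an assumed unnormalised target density p(x) ∝ exp(x**2 - x**4).
    return x**2 - x**4

def metropolis(n_samples, step=1.0, x0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, p(proposal) / p(x)); only the ratio of
        # unnormalised densities enters, so normalisation is unnecessary.
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal
        samples[i] = x
    return samples

draws = metropolis(10_000)
print(f"sample mean {draws.mean():.3f}, sample std {draws.std():.3f}")
```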
Course Contents

  • A Bayesian approach to probability theory. Full Bayesian inference
  • Bayesian model selection. Probabilistic interpretation of linear and logistic regression models
  • EM algorithm (a short illustrative sketch follows this list)
  • Variational approach
  • Markov chain Monte Carlo (MCMC) methods
  • Stochastic variational inference. Variational autoencoder
  • Diffusion models. Normalizing flows. Score matching
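
As a brief illustration of the EM algorithm listed above (a hypothetical example with synthetic data, not course material), the sketch below alternates the E-step (computing responsibilities) and the M-step (re-estimating parameters) for a one-dimensional mixture of two Gaussians.

```python
# Hypothetical example: EM for a 1-D mixture of two Gaussians on synthetic data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 700)])

# Initial guesses for mixing weights, means, and standard deviations.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: posterior responsibility of each component for each data point.
    dens = pi * norm.pdf(data[:, None], mu, sigma)        # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from responsibility-weighted data.
    nk = resp.sum(axis=0)
    pi = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)

print("weights:", pi.round(3), "means:", mu.round(3), "stds:", sigma.round(3))
```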
Assessment Elements

  • Homework assignment (non-blocking)
  • Exam (blocking)
Interim Assessment

  • 2023/2024 1st module
    0.4 * Exam + 0.6 * Homework assignment
Bibliography

Recommended Core Bibliography

  • Christopher M. Bishop. (2006). Pattern Recognition and Machine Learning. Springer. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsbas&AN=edsbas.EBA0C705
  • Mehryar Mohri, Afshin Rostamizadeh, & Ameet Talwalkar. (2018). Foundations of Machine Learning, Second Edition. The MIT Press.

Recommended Additional Bibliography

  • Tipping, M. E. (2001). Sparse Bayesian Learning and the Relevance Vector Machine. Journal of Machine Learning Research, 1(3), 211–244. https://doi.org/10.1162/15324430152748236