Postgraduate Studies 2019/2020

Generative Models in Machine Learning

Status: Elective course
Field of study: 02.06.01 Computer and Information Sciences
When delivered: 2nd year, 2nd semester
Mode of study: without an online course
Instructors: Attila Kertész-Farkas
Language: English
Credits: 4
Contact hours: 36

Course Syllabus

Abstract

This course gives an introduction to generative models, which aim at learning a probabilistic representation of the data distribution. After completing the discipline, the PhD student should know the main probabilistic models, modern methods such as deep learning techniques, and ongoing developments in machine learning; have hands-on experience with large-scale machine learning problems; be able to design and develop machine learning programs in a programming language such as R or Python; and be able to think critically about real data.
Learning Objectives

  • The learning objective of the course “Generative Models in Machine Learning” is to provide students with advanced techniques and deeper theoretical and practical knowledge of modern probabilistic learning techniques, such as:
    • basic principles, generative models;
    • Bayesian networks, Markov random fields, Boltzmann machines, variational autoencoders;
    • sampling and inference, variational inference, variational methods;
    • neural networks;
    • deep learning techniques.
Expected Learning Outcomes

  • Students are introduced to probabilistic modelling of data.
  • Students are introduced to the generalization and theory of the basic methods.
  • Students are introduced to generating new data from probability distributions.
  • Students know variational methods for learning to represent the data distribution.
  • Students know state-of-the-art methods currently used in data generation processes.
  • Students are introduced to deep generative models such as deep belief networks, etc.
Course Contents

  • Introduction to machine learning, Bayesian Decision Theory, Maximum Likelihood Estimation, and EM
    Basic definitions, principles and types of machine learning. Classifiers, Discriminant Functions, and Decision Surfaces, Minimum-Error-Rate Classification, Neyman-Pearson lemma, Distributions, Relation to Logistic Regression, Naïve Bayes classification, basics of MLE, learning parameters of distributions. Gaussian Mixture Models, Latent Variables, Examples, Expectation-Maximization, Latent Dirichlet Allocation. (A minimal EM sketch for Gaussian mixtures appears after this list.)
  • Exponential Family, Sufficient Statistics
    Generalized Linear Models. (The canonical exponential-family form is written out after this list.)
  • Graphical Models and Generative Learning
    Bayesian Networks, Markov Random Fields, Conditional Random Fields, Boltzmann Machines, Energy-based methods. Hidden Markov Models.
  • Sampling and Inference
    Exact and Inexact Inference, Gibbs sampling, Bridge Sampling, Simple and Annealed Importance Sampling, Monte-Carlo EM, Junction Tree algorithm. (A toy Gibbs sampler is sketched after this list.)
  • Variational Learning
    Mean-Field, Bethe Approximation, Variational methods, Variational Message Passing, Free-Energy, Variational Free Energy. Variational Bayes, Variational Bayes Expectation-Maximization. (The ELBO bound is written out after this list.)
  • Generative Models
    Restricted Boltzmann Machines, Helmholtz Machines and Wake-Sleep algorithms, Energy-based methods. Generative Adversarial Networks, Generative Auto-Encoders, Belief networks, connectionist learning. Variational AutoEncoders. (A minimal variational autoencoder is sketched after this list.)
  • Deep learning techniques
    Neural Networks, Shallow networks, Multilayer Neural networks, back-propagation, deep learning, Universal Approximation. Auto Encoders, Stacked Auto-Encoders, Stacked Boltzmann machines, supervised and unsupervised pre-training, Deep Belief Networks. Deep Universality theorems. (A small backpropagation example follows after this list.)
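A minimal sketch of EM for a Gaussian mixture model, illustrating the Expectation-Maximization topic above. It is written in Python with NumPy and SciPy; the function name em_gmm, the default hyperparameters, and the random initialisation are illustrative assumptions rather than prescribed course code.

    import numpy as np
    from scipy.stats import multivariate_normal

    def em_gmm(X, k=2, n_iter=50, seed=0):
        """Fit a k-component Gaussian mixture to X (n x d) by Expectation-Maximization."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        pi = np.full(k, 1.0 / k)                        # mixing weights
        mu = X[rng.choice(n, size=k, replace=False)]    # random data points as initial means
        sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])
        for _ in range(n_iter):
            # E-step: responsibilities r[i, j] = p(component j | x_i)
            r = np.column_stack([pi[j] * multivariate_normal.pdf(X, mu[j], sigma[j])
                                 for j in range(k)])
            r /= r.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights, means and covariances from the responsibilities
            nk = r.sum(axis=0)
            pi = nk / n
            mu = (r.T @ X) / nk[:, None]
            for j in range(k):
                diff = X - mu[j]
                sigma[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
        return pi, mu, sigma

Calling em_gmm on a dataset with a few well-separated clusters returns the estimated mixing weights, means, and covariances.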
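For the Exponential Family and Sufficient Statistics topic, the canonical form (a standard identity, written in LaTeX) is

    p(x \mid \eta) = h(x)\, \exp\!\big( \eta^{\top} T(x) - A(\eta) \big),

where T(x) is the sufficient statistic and A(\eta) is the log-partition function. The Bernoulli, Gaussian, and Poisson distributions all take this form, and generalized linear models let the natural parameter \eta depend linearly on the covariates.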
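A toy Gibbs sampler for the Sampling and Inference topic: a zero-mean bivariate Gaussian with correlation rho, where each full conditional is itself a one-dimensional Gaussian. The function name and default arguments are illustrative assumptions.

    import numpy as np

    def gibbs_bivariate_normal(rho=0.8, n_samples=5000, seed=0):
        """Alternately resample x1 | x2 and x2 | x1; both conditionals are N(rho * other, 1 - rho**2)."""
        rng = np.random.default_rng(seed)
        x1, x2 = 0.0, 0.0
        samples = np.empty((n_samples, 2))
        for t in range(n_samples):
            x1 = rng.normal(rho * x2, np.sqrt(1.0 - rho ** 2))
            x2 = rng.normal(rho * x1, np.sqrt(1.0 - rho ** 2))
            samples[t] = (x1, x2)
        return samples

After discarding a short burn-in, the empirical correlation of the samples should be close to rho.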
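For the Variational Learning topic, the bound referred to above is the evidence lower bound (ELBO), i.e. the negative variational free energy maximized by variational Bayes:

    \log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\big[ \log p_\theta(x \mid z) \big] \;-\; \mathrm{KL}\!\big( q_\phi(z \mid x) \,\|\, p(z) \big) \;=\; -F(q).

Mean-field methods restrict q to a fully factorized family q(z) = \prod_i q_i(z_i) and optimize one factor at a time.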
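A minimal variational autoencoder for the Generative Models topic, written in Python with PyTorch. It optimizes the negative ELBO from the bound above via the reparameterisation trick; the class name VAE, the layer sizes, and the use of a Bernoulli (binary cross-entropy) decoder are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class VAE(nn.Module):
        def __init__(self, x_dim=784, h_dim=256, z_dim=16):
            super().__init__()
            self.enc = nn.Linear(x_dim, h_dim)       # encoder (recognition network)
            self.mu = nn.Linear(h_dim, z_dim)
            self.logvar = nn.Linear(h_dim, z_dim)
            self.dec1 = nn.Linear(z_dim, h_dim)      # decoder (generative network)
            self.dec2 = nn.Linear(h_dim, x_dim)

        def forward(self, x):
            h = torch.relu(self.enc(x))
            mu, logvar = self.mu(h), self.logvar(h)
            z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterisation trick
            x_hat = torch.sigmoid(self.dec2(torch.relu(self.dec1(z))))
            return x_hat, mu, logvar

    def neg_elbo(x, x_hat, mu, logvar):
        # reconstruction term + KL(q(z|x) || N(0, I)), both summed over the batch
        recon = F.binary_cross_entropy(x_hat, x, reduction='sum')
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl

Training amounts to minimizing neg_elbo over minibatches with any standard optimizer.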
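A self-contained backpropagation example for the deep learning topic: a one-hidden-layer network with a tanh hidden layer, a sigmoid output, and a cross-entropy loss, written in plain NumPy. The function name train_mlp and the hyperparameter defaults are illustrative assumptions.

    import numpy as np

    def train_mlp(X, y, hidden=16, lr=0.1, epochs=500, seed=0):
        """Binary classifier trained by full-batch gradient descent with hand-coded backpropagation."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W1 = rng.normal(0.0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
        W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
        y = y.reshape(-1, 1)
        for _ in range(epochs):
            # forward pass
            h = np.tanh(X @ W1 + b1)
            p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output
            # backward pass: gradient of the mean cross-entropy loss
            dlogits = (p - y) / n
            dW2 = h.T @ dlogits; db2 = dlogits.sum(axis=0)
            dh = dlogits @ W2.T * (1.0 - h ** 2)         # derivative of tanh
            dW1 = X.T @ dh; db1 = dh.sum(axis=0)
            # gradient descent step
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
        return W1, b1, W2, b2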
Assessment Elements

  • Presence (non-blocking)
  • Exam (non-blocking)
    Written exam. Preparation time: 180 min.
Interim Assessment

  • Interim assessment (2nd semester)
    0.7 * Exam + 0.3 * Presence
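    For example, a hypothetical exam grade of 8 and a presence grade of 10 would give 0.7 * 8 + 0.3 * 10 = 8.6 as the interim grade.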
Bibliography

Recommended Core Bibliography

  • Wainwright, M. J., & Jordan, M. I. (2008). Graphical Models, Exponential Families, and Variational Inference. Boston: Now Publishers. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=nlebk&AN=352768

Recommended Additional Bibliography

  • James, G. et al. (2013). An Introduction to Statistical Learning. Springer. 426 pp.