2019/2020

Markov Chains

Rated best by the criterion "Usefulness of the course for broadening one's horizons and versatile development"
Status: Elective course from the faculty-wide pool
When taught: Modules 1 and 2
Language: English
Credits: 3

Course Syllabus

Abstract

Markov chains form the simplest class of random processes in which the future does not depend on the past but only on the present state of the process. Despite this simplicity, Markov chains have deep and very beautiful mathematics. They are arguably the most important class of random processes, in particular because of their numerous applications in mathematics, physics, computer science, biology, economics, and other fields. Indeed, once a stochastic process is given, it is natural to simplify it by assuming that the future does not depend on the past, and this approximation often works well. The present course is an introduction to the theory of Markov chains. It covers their most important properties and best-known applications. The course is aimed at 3rd- and 4th-year students but is also open to 1st- and 2nd-year students. The only prerequisites are basic courses in analysis and linear algebra.
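The defining property above can be seen in a few lines of code: sampling the next state requires only the current state and a row of the transition matrix. This is a minimal sketch; the three "weather" states and the matrix entries are invented for illustration and are not part of the course materials.

```python
import numpy as np

# Hypothetical 3-state chain (0 = sunny, 1 = cloudy, 2 = rainy).
# Each row gives the transition probabilities out of one state and sums to 1.
P = np.array([
    [0.7, 0.2, 0.1],   # transitions from "sunny"
    [0.3, 0.4, 0.3],   # transitions from "cloudy"
    [0.2, 0.4, 0.4],   # transitions from "rainy"
])

def simulate(P, start, steps, rng):
    """Sample a trajectory: the next state depends only on the current one."""
    path = [start]
    for _ in range(steps):
        # Markov property: only P[current_state] matters, not the history.
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

rng = np.random.default_rng(0)
print(simulate(P, start=0, steps=10, rng=rng))
```

Changing the matrix `P` changes the long-run behavior of such trajectories, which is exactly what the course's results on stationary states and ergodicity describe.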
Learning Objectives

  • Teaching the audience what Markov chains with a finite number of states are, together with the corresponding basic techniques.
  • Teaching the possible types of large-time behavior of Markov chains with a finite number of states.
  • Teaching some applications of Markov chain techniques to various examples arising in different areas.
Expected Learning Outcomes

  • After the course, students are expected to understand what a Markov chain with a finite number of states is, and to know its basic properties, the possible types of its large-time behavior, and its possible topological structure.
  • Students are expected to be familiar with a number of examples of Markov chains with a finite number of states, to be able to recognize whether a given discrete stochastic system with a finite number of states is a Markov chain, and, if so, to apply Markov chain techniques to study it.
Course Contents

  • Markov chains with a finite number of states
  • Examples
  • Stationary states and their existence
  • Ergodic theorem for Markov chains with an ergodic transition probability matrix
  • Applications of the ergodic theorem: the law of large numbers for Markov chains; Google's PageRank
  • Perron–Frobenius theorem
  • Topological structure of Markov chains
  • Periodic Markov chains
  • Aperiodic Markov chains; ergodic theorem for irreducible aperiodic Markov chains
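Several of these topics meet in one computation: for an ergodic transition matrix, iterating any initial distribution against the matrix converges to the unique stationary distribution, which is also the idea behind PageRank. The sketch below uses an invented 3x3 matrix purely as an illustration, not an example from the course.

```python
import numpy as np

# Illustrative ergodic transition matrix (all entries positive, rows sum to 1).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
])

# Power iteration: repeatedly multiply a distribution by P on the right.
# The ergodic theorem guarantees convergence to the stationary distribution pi
# (satisfying pi P = pi) regardless of the starting distribution.
pi = np.array([1.0, 0.0, 0.0])   # arbitrary initial distribution
for _ in range(1000):
    pi = pi @ P

print(pi)                         # approximate stationary distribution
print(np.allclose(pi @ P, pi))   # check pi P = pi
```

PageRank applies the same iteration to a (modified) transition matrix of the web's link graph; the stationary probability of a page serves as its rank.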
Assessment Elements

  • Non-blocking: current grade
    Problem-solving sessions
  • Non-blocking: final exam
    Written exam
Interim Assessment

  • Interim assessment (Module 2)
    0.5 * current grade + 0.5 * final exam
Bibliography

Recommended Core Bibliography

  • Gnedenko, B. V., & Ushakov, I. A. (1997). Theory of Probability (6th ed.). Amsterdam: Routledge. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=1835590
  • Meyn, S. P., & Tweedie, R. L. (2009). Markov Chains and Stochastic Stability (2nd ed.). Cambridge: Cambridge University Press. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=313161

Recommended Additional Bibliography

  • Ibe, O. C. (2013). Markov Processes for Stochastic Modeling (2nd ed.). Chennai: Elsevier. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=516132