Sergey Samsonov
- Lecturer: Faculty of Computer Science / Big Data and Information Retrieval School
- Research Assistant, Doctoral Student: Faculty of Computer Science / International Laboratory of Stochastic Algorithms and High-Dimensional Inference
- Sergey Samsonov has been at HSE University since 2018.
Education
Bachelor's in Applied Mathematics and Information Science
Lomonosov Moscow State University

Young Faculty Support Program (Group of Young Academic Professionals)
Category "New Researchers" (2020-2021)
Postgraduate Studies
3rd year of study
Approved thesis topic: Concentration inequalities for functionals of Markov chains and applications to variance reduction in MCMC
Academic Supervisor: Naumov, Alexey
Courses (2021/2022)
- Calculus 2 (Bachelor's programme; Faculty of Computer Science; year 2, modules 1-4; in Russian)
- Mathematical Foundations of Reinforcement Learning (Master's programme; Faculty of Computer Science; year 2, module 2; in English)
- Modern Methods of Data Analysis: Stochastic Calculus (Master's programme; Faculty of Computer Science; year 1, modules 1-2; in English)
Courses (2020/2021)
- Calculus 2 (Bachelor's programme; Faculty of Computer Science; year 2, modules 1-4; in Russian)
- Modern Methods of Data Analysis: Stochastic Calculus (Master's programme; Faculty of Computer Science; year 1, modules 1-2; in English)
Publications
- Article Belomestny D., Moulines E., Samsonov S. Variance reduction for additive functionals of Markov chains via martingale representations // Statistics and Computing. 2022. Vol. 32. No. 1. Article 16.
- Chapter Durmus A., Moulines E., Naumov A., Samsonov S., Wai H. On the Stability of Random Matrix Product with Markovian Noise: Application to Linear Stochastic Approximation and TD Learning, in: Proceedings of Machine Learning Research Vol. 134: Conference on Learning Theory. PMLR, 2021. P. 1711-1752.
- Article Durmus A., Moulines E., Naumov A., Samsonov S. Probability and moment inequalities for additive functionals of geometrically ergodic Markov chains // arXiv preprint arXiv:2109.00331. 2021.
- Article Belomestny D., Moulines E., Naumov A., Puchkin N., Samsonov S. Rates of convergence for density estimation with GANs // arXiv preprint arXiv:2102.00199. 2021. P. 1-27.
- Chapter Durmus A., Moulines E., Naumov A., Samsonov S., Scaman K., Wai H. Tight High Probability Bounds for Linear Stochastic Approximation with Fixed Stepsize, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 30063-30074.
- Article Belomestny D., Levin I., Moulines E., Naumov A., Samsonov S., Zorina V. UVIP: Model-Free Approach to Evaluate Reinforcement Learning Algorithms // arXiv preprint arXiv:2105.02135. 2021.
- Article Belomestny D., Iosipoi L., Moulines E., Naumov A., Samsonov S. Variance reduction for dependent sequences with applications to Stochastic Gradient MCMC // SIAM/ASA Journal on Uncertainty Quantification. 2021. Vol. 9. No. 2. P. 507-535.
- Article Belomestny D., Moulines E., Iosipoi L., Naumov A., Samsonov S. Variance reduction for Markov chains with application to MCMC // Statistics and Computing. 2020. Vol. 30. P. 973-997.
- Article Samsonov S., Ushakov N., Ushakov V. Estimation of the Second Moment Based on Rounded Data // Journal of Mathematical Sciences. 2019. Vol. 237. No. 6. P. 819-825.
- Chapter Samsonov S., Ushakov N., Ushakov V. Consistent variance estimation based on rounded data, in: Book of Abstracts of the XXXIV International Seminar on Stability Problems for Stochastic Models, August 25-29, 2017, Debrecen, Hungary. 2017. P. 121. (in press)
Conferences
- 2020
Math of Machine Learning (HDI Lab Summer School) (Sochi). Presentation: Variance reduction for MCMC methods via martingale representations
- 2019
New frontiers in high-dimensional probability and statistics 2 (Moscow). Presentation: Concentration Inequalities for Functionals of Markov Chains with Applications to Variance Reduction
Structural Inference in High-Dimensional Models 2 (Pushkin). Presentation: Variance Reduction for Dependent Sequences via Empirical Variance Minimisation
SDEs/SPDEs: Theory, Numerics and their Interplay with Data Science (Heraklion). Presentation: Variance reduction for dependent sequences via empirical variance minimisation
Faculty Submits Ten Papers to NeurIPS 2021
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It takes place on December 6-14, 2021.