Sergey Samsonov
- Lecturer: Faculty of Computer Science / Big Data and Information Retrieval School
- Research Fellow, Doctoral Student: Faculty of Computer Science / International Laboratory of Stochastic Algorithms and High-Dimensional Inference
- Sergey Samsonov has been at HSE University since 2018.
Education
Bachelor's in Applied Mathematics and Information Science
Lomonosov Moscow State University

Young Faculty Support Program (Group of Young Academic Professionals)
Category "New Researchers" (2020-2021)
Postgraduate Studies
4th year of study
Approved topic of thesis: Concentration inequalities for functionals of Markov chains and applications to variance reduction in MCMC
Academic Supervisor: Alexey Naumov
Courses (2022/2023)
- Additional Topics in Probability Theory (Master’s programme; Faculty of Computer Science; 1 year, 1 module; in English)
- Fundamentals of Matrix Computations (Bachelor’s programme; Faculty of Computer Science; 2 year, 3, 4 module; in Russian)
- Markov Chains (Master’s programme; Faculty of Computer Science; 1 year, 2, 3 module; in English)
- Mathematical Foundations of Reinforcement Learning (Master’s programme; Faculty of Computer Science; 2 year, 2 module; in English)
- Mathematical Statistics (advanced course) (Bachelor’s programme; Faculty of Computer Science; 2 year, 3, 4 module; in Russian)
Past Courses
Courses (2021/2022)
- Calculus 2 (Bachelor’s programme; Faculty of Computer Science; 2 year, 1-4 module; in Russian)
- Mathematical Foundations of Reinforcement Learning (Master’s programme; Faculty of Computer Science; 2 year, 2 module; in English)
- Modern Methods of Data Analysis: Stochastic Calculus (Master’s programme; Faculty of Computer Science; 1 year, 1, 2 module; in English)
Courses (2020/2021)
- Calculus 2 (Bachelor’s programme; Faculty of Computer Science; 2 year, 1-4 module; in Russian)
- Modern Methods of Data Analysis: Stochastic Calculus (Master’s programme; Faculty of Computer Science; 1 year, 1, 2 module; in English)
Publications (15)
- Article Belomestny D., Naumov A., Puchkin N., Samsonov S. Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations // Neural Networks. 2023. Vol. 161. P. 242-253. doi
- Chapter Cardoso G., Samsonov S., Thin A., Moulines E., Olsson J. BR-SNIS: Bias Reduced Self-Normalized Importance Sampling, in: Thirty-Sixth Conference on Neural Information Processing Systems : NeurIPS 2022. Curran Associates, Inc., 2022. P. 716-729.
- Article Durmus A., Moulines E., Naumov A., Samsonov S. Finite-time High-probability Bounds for Polyak-Ruppert Averaged Iterates of Linear Stochastic Approximation // arXiv preprint arXiv:2207.04475. 2022.
- Chapter Tiapkin D., Belomestny D., Moulines E., Naumov A., Samsonov S., Tang Y., Valko M., Menard P. From Dirichlet to Rubin: Optimistic Exploration in RL without Bonuses, in: Proceedings of the 39th International Conference on Machine Learning Vol. 162. PMLR, 2022. P. 21380-21431.
- Chapter Samsonov S., Lagutin E., Gabrie M., Durmus A., Naumov A., Moulines E. Local-Global MCMC kernels: the best of both worlds, in: Thirty-Sixth Conference on Neural Information Processing Systems : NeurIPS 2022. Curran Associates, Inc., 2022. P. 5178-5193.
- Article Belomestny D., Moulines E., Samsonov S. Variance reduction for additive functionals of Markov chains via martingale representations // Statistics and Computing. 2022. Vol. 32. No. 1. Article 16. doi
- Chapter Durmus A., Moulines E., Naumov A., Samsonov S., Wai H. On the Stability of Random Matrix Product with Markovian Noise: Application to Linear Stochastic Approximation and TD Learning, in: Proceedings of Machine Learning Research Vol. 134: Conference on Learning Theory. PMLR, 2021. P. 1711-1752.
- Article Durmus A., Moulines E., Naumov A., Samsonov S. Probability and moment inequalities for additive functionals of geometrically ergodic Markov chains // arXiv preprint arXiv:2109.00331. 2021.
- Article Belomestny D., Moulines E., Naumov A., Puchkin N., Samsonov S. Rates of convergence for density estimation with GANs // arXiv preprint arXiv:2102.00199. 2021. P. 1-27.
- Chapter Durmus A., Moulines E., Naumov A., Samsonov S., Scaman K., Wai H. Tight High Probability Bounds for Linear Stochastic Approximation with Fixed Stepsize, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 30063-30074.
- Article Belomestny D., Levin I., Moulines E., Naumov A., Samsonov S., Zorina V. UVIP: Model-Free Approach to Evaluate Reinforcement Learning Algorithms // arXiv preprint arXiv:2105.02135. 2021.
- Article Belomestny D., Iosipoi L., Moulines E., Naumov A., Samsonov S. Variance reduction for dependent sequences with applications to Stochastic Gradient MCMC // SIAM-ASA Journal on Uncertainty Quantification. 2021. Vol. 9. No. 2. P. 507-535. doi
- Article Belomestny D., Moulines E., Iosipoi L., Naumov A., Samsonov S. Variance reduction for Markov chains with application to MCMC // Statistics and Computing. 2020. Vol. 30. P. 973-997. doi
- Article Samsonov S., Ushakov N., Ushakov V. Estimation of the Second Moment Based on Rounded Data // Journal of Mathematical Sciences. 2019. Vol. 237. No. 6. P. 819-825. doi
- Chapter Samsonov S., Ushakov N., Ushakov V. Consistent variance estimation based on rounded data, in: Book of abstracts of XXXIV International Seminar on Stability Problems for Stochastic Models. August 25–29, 2017. Debrecen, Hungary. , 2017. P. 121-121. (in press)
Conferences
- 2020
Math of Machine Learning (HDI Lab Summer School) (Sochi). Presentation: Variance reduction for MCMC methods via martingale representations
- 2019
New frontiers in high-dimensional probability and statistics 2 (Moscow). Presentation: Concentration Inequalities for Functionals of Markov Chains with Applications to Variance Reduction
Structural Inference in High-Dimensional Models 2 (Pushkin). Presentation: Variance Reduction for Dependent Sequences via Empirical Variance Minimisation
SDEs/SPDEs: Theory, Numerics and their interplay with Data Science (Heraklion). Presentation: Variance reduction for dependent sequences via empirical variance minimisation
Three HSE Researchers Receive Ilya Segalovich Award
Three researchers of the HSE Faculty of Computer Science are among the winners of the 2022 Ilya Segalovich Award: Research Professor Dmitry Vetrov, Associate Professor Alexey Naumov, and doctoral student Sergey Samsonov. The award, established by Yandex in 2019, supports young researchers and the scientific community in the field of IT in Russia, Belarus, and Kazakhstan.
17 Articles by Researchers of HSE Faculty of Computer Science Accepted at NeurIPS
In 2022, 17 articles by researchers of the HSE Faculty of Computer Science were accepted at NeurIPS (the Conference on Neural Information Processing Systems), one of the world’s most prestigious events in the field of machine learning and artificial intelligence. The 36th conference will be held in a hybrid format from November 28th to December 9th in New Orleans (USA).
HSE Faculty of Computer Science and Skoltech Hold Math of Machine Learning Olympiad 2022
HSE's Faculty of Computer Science and the Skolkovo Institute of Science and Technology have held the Mathematics of Machine Learning Olympiad for the fifth time. The participants competed for prizes and the opportunity to matriculate at two universities without exams by enrolling in the HSE and Skoltech joint master's programme in Math of Machine Learning.
Faculty Submits Ten Papers to NeurIPS 2021
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It takes place on December 6-14, 2021.