- Lecturer: Faculty of Computer Science / Big Data and Information Retrieval School
- Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Centre of Deep Learning and Bayesian Methods
- Junior Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Research Laboratory for Data Analysis in Financial Technologies
- Kirill Struminsky has been at HSE University since 2016.
To conduct research in the area of deep generative modeling and related areas of deep learning.
- Research Seminar "Machine Learning and Applications" (Bachelor's programme; Faculty of Computer Science; 3rd year, modules 1-4; in Russian)
- Research Seminar "Machine Learning and Applications 2" (Bachelor's programme; Faculty of Computer Science; 4th year, modules 1-3; in Russian)
- Chapter Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-Variance Black-Box Gradient Estimates for the Plackett-Luce Distribution, in: Thirty-Fourth AAAI Conference on Artificial Intelligence Vol. 34. Palo Alto, California USA: AAAI Press, 2020. P. 10126-10135. doi
- Article Struminsky K., Vetrov D. A Simple Method to Evaluate Support Size and Non-uniformity of a Decoder-Based Generative Model // Lecture Notes in Computer Science. 2019. Vol. 11832. P. 81-93. doi
- Preprint Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-variance Gradient Estimates for the Plackett-Luce Distribution / Bayesian Deep Learning NeurIPS 2019 Workshop. Series 2019 "Bayesian Deep Learning NeurIPS 2019 Workshop". 2019.
- Chapter Struminsky K., Lacoste-Julien S., Osokin A. Quantifying Learning Guarantees for Convex but Inconsistent Surrogates, in: Advances in Neural Information Processing Systems 31 (NIPS 2018). 2018. P. 1-9.
- Article Figurnov M., Struminsky K., Vetrov D. A Noise-Tolerant Method for Training a Variational Autoencoder // Intellektual'nye Sistemy. Teoriya i Prilozheniya (Intelligent Systems. Theory and Applications). 2017. Vol. 21. No. 2. P. 90-109.
- Chapter Struminsky K., Kruglik S., Vetrov D., Oseledets I. A new approach for sparse Bayesian channel estimation in SCMA uplink systems, in: 2016 8th International Conference on Wireless Communications and Signal Processing, WCSP 2016. October 13 - 15, Yangzhou, China. NY : Institute of Electrical and Electronic Engineers, 2016. P. 1-5. doi
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It takes place from December 6 to 14, 2021.
One paper will be presented at AISTATS (Japan, April 2019) and three papers will be presented at ICLR (USA, May 2019).
The faculty presented the results of their research at NeurIPS, the largest international machine learning conference.
Researchers of the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montreal, Canada.
In December 2016, five new international laboratories opened at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. This lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern machine learning: the neural network paradigm and the Bayesian paradigm.
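To give a flavour of what "combining the neural network and Bayesian paradigms" can mean in practice, the sketch below is a minimal, hypothetical illustration (not the lab's actual models): a linear layer that keeps a Gaussian posterior over its weights instead of point estimates, and averages predictions over Monte Carlo weight samples, which also yields a per-input uncertainty estimate. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "Bayesian" linear layer: instead of a single weight matrix,
# we keep a Gaussian posterior over weights (mean mu, std sigma).
in_dim, out_dim = 3, 2
mu = rng.normal(size=(in_dim, out_dim))        # posterior means
log_sigma = np.full((in_dim, out_dim), -1.0)   # posterior log-stds

def bayesian_forward(x, n_samples=100):
    """Average predictions over Monte Carlo weight samples."""
    sigma = np.exp(log_sigma)
    outs = []
    for _ in range(n_samples):
        # Reparameterization: w = mu + sigma * eps, with eps ~ N(0, 1).
        w = mu + sigma * rng.normal(size=mu.shape)
        outs.append(x @ w)
    outs = np.stack(outs)                       # (n_samples, batch, out_dim)
    # Predictive mean and spread: the spread reflects weight uncertainty.
    return outs.mean(axis=0), outs.std(axis=0)

x = rng.normal(size=(4, in_dim))
pred_mean, pred_std = bayesian_forward(x)
```

In a full variational treatment, `mu` and `log_sigma` would be trained by maximizing an evidence lower bound; here they are fixed only to keep the sketch short.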