- Visiting Scholar: Faculty of Computer Science / Big Data and Information Retrieval School
- Research Assistant: Faculty of Computer Science / Big Data and Information Retrieval School / Centre of Deep Learning and Bayesian Methods
- Research Assistant: Faculty of Computer Science / Big Data and Information Retrieval School / Research Laboratory for Data Analysis in Financial Technologies
- Kirill Struminsky has been at HSE since 2016.
Conducts research in deep generative modeling and related areas of deep learning.
- Chapter Struminsky K., Lacoste-Julien S., Osokin A. Quantifying Learning Guarantees for Convex but Inconsistent Surrogates, in: Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018. P. 1-9.
- Article Figurnov M. V., Struminsky K. A., Vetrov D. P. A noise-robust method for training a variational autoencoder // Интеллектуальные системы. Теория и приложения (Intelligent Systems. Theory and Applications). 2017. Vol. 21. No. 2. P. 90-109.
- Chapter Struminsky K., Kruglik S., Vetrov D., Oseledets I. A new approach for sparse Bayesian channel estimation in SCMA uplink systems, in: 2016 8th International Conference on Wireless Communications and Signal Processing, WCSP 2016. October 13-15, Yangzhou, China. NY: Institute of Electrical and Electronics Engineers, 2016. P. 1-5.
One paper will be presented at AISTATS (Japan, April 2019) and three papers will be presented at ICLR (USA, May 2019).
Faculty members presented their research results at NeurIPS, the largest international machine learning conference
Researchers from the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held on 2-8 December 2018 in Montreal, Canada.
In December 2016, five new international laboratories opened at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. The lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern machine learning: the neural network paradigm and the Bayesian paradigm.