Kirill Struminsky
- Lecturer: Faculty of Computer Science / Big Data and Information Retrieval School
- Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Centre of Deep Learning and Bayesian Methods
- Junior Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Research Laboratory for Data Analysis in Financial Technologies
- Kirill Struminsky has been at HSE University since 2016.
Responsibilities
Conducting research in deep generative modeling and related areas of deep learning.
Student Term / Thesis Papers
- Bachelor
D. Rakitin, Neurally-Guided Program Induction. Faculty of Computer Science, 2020
I. Ponamareva, Learning to Rank with Variational Optimization. Faculty of Computer Science, 2020
Courses (2020/2021)
- Research Seminar "Machine Learning and Applications 2" (Bachelor's programme; Faculty of Computer Science; 4th year, modules 1-3; taught in Russian)
Publications
- Chapter Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-Variance Black-Box Gradient Estimates for the Plackett-Luce Distribution, in: Thirty-Fourth AAAI Conference on Artificial Intelligence. Vol. 34. Palo Alto, California, USA: AAAI Press, 2020. P. 10126-10135.
- Article Struminsky K., Vetrov D. A Simple Method to Evaluate Support Size and Non-uniformity of a Decoder-Based Generative Model // Lecture Notes in Computer Science. 2019. Vol. 11832. P. 81-93.
- Preprint Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-variance Gradient Estimates for the Plackett-Luce Distribution / Bayesian Deep Learning NeurIPS 2019 Workshop. Series 2019 "Bayesian Deep Learning NeurIPS 2019 Workshop". 2019.
- Chapter Atanov A., Ashukha A., Struminsky K., Vetrov D., Welling M. The Deep Weight Prior, in: Proceedings of the 7th International Conference on Learning Representations (ICLR 2019). ICLR, 2019. P. 1-17.
- Chapter Struminsky K., Lacoste-Julien S., Osokin A. Quantifying Learning Guarantees for Convex but Inconsistent Surrogates, in: Advances in Neural Information Processing Systems 31 (NIPS 2018). 2018. P. 1-9.
- Article Figurnov M. V., Struminsky K. A., Vetrov D. P. A noise-robust method for training a variational autoencoder // Intellektual'nye sistemy. Teoriya i prilozheniya (Intelligent Systems. Theory and Applications). 2017. Vol. 21. No. 2. P. 90-109. (in Russian)
- Chapter Struminsky K., Kruglik S., Vetrov D., Oseledets I. A new approach for sparse Bayesian channel estimation in SCMA uplink systems, in: 2016 8th International Conference on Wireless Communications and Signal Processing (WCSP 2016), October 13-15, Yangzhou, China. NY: Institute of Electrical and Electronics Engineers, 2016. P. 1-5.
- Preprint Figurnov M., Struminsky K., Vetrov D. Robust Variational Inference / Cornell University. Series arXiv:1611.09226 "arxiv.org". 2016.
Faculty members will present their research at the ICLR and AISTATS conferences
One paper will be presented at AISTATS (Japan, April 2019) and three papers will be presented at ICLR (USA, May 2019).
Faculty members presented their research results at NeurIPS, the largest international machine learning conference
Researchers from the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montreal, Canada.
'Machine Learning Algorithm Able to Find Data Patterns a Human Could Not'
In December 2016, five new international laboratories opened up at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. This lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern-day machine learning – the neural network paradigm and the Bayesian paradigm.