- Research Fellow, Deputy Head of the Centre: Faculty of Computer Science / AI and Digital Science Institute / Centre of Deep Learning and Bayesian Methods
- Kirill Struminsky has been at HSE University since 2016.
Conducts research in deep generative modelling and related areas of deep learning.
- Research Seminar "Machine Learning and Applications" (Bachelor’s programme; Faculty of Computer Science; 3 year, 1-4 module)Rus
- Research Seminar "Machine Learning and Applications 2" (Bachelor’s programme; Faculty of Computer Science; 4 year, 1-3 module)Rus
- Chapter Morozov N., Rakitin D., Desheulin O., Vetrov D., Struminsky K. Differentiable Rendering with Reparameterized Volume Sampling, in: Neural Fields across Fields: Methods and Applications of Implicit Neural Representations (ICLR 2023 Workshop), 2023. Ch. 8.
- Chapter Struminsky K., Gadetsky A., Rakitin D., Karpushkin D., Vetrov D. Leveraging Recursive Gumbel-Max Trick for Approximate Inference in Combinatorial Spaces, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 10999-11011. doi
- Chapter Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-Variance Black-Box Gradient Estimates for the Plackett-Luce Distribution, in: Thirty-Fourth AAAI Conference on Artificial Intelligence, Vol. 34. Palo Alto, California, USA: AAAI Press, 2020. P. 10126-10135. doi
- Article Struminsky K., Vetrov D. A Simple Method to Evaluate Support Size and Non-uniformity of a Decoder-Based Generative Model // Lecture Notes in Computer Science. 2019. Vol. 11832. P. 81-93. doi
- Preprint Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-variance Gradient Estimates for the Plackett-Luce Distribution / Bayesian Deep Learning NeurIPS 2019 Workshop, 2019.
- Chapter Struminsky K., Lacoste-Julien S., Osokin A. Quantifying Learning Guarantees for Convex but Inconsistent Surrogates, in: Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018. P. 1-9.
- Article Figurnov M. V., Struminsky K. A., Vetrov D. P. A Noise-Robust Method for Training a Variational Autoencoder // Интеллектуальные системы. Теория и приложения. 2017. Vol. 21. No. 2. P. 90-109. (in Russian)
- Chapter Struminsky K., Kruglik S., Vetrov D., Oseledets I. A new approach for sparse Bayesian channel estimation in SCMA uplink systems, in: 2016 8th International Conference on Wireless Communications and Signal Processing (WCSP 2016), October 13-15, Yangzhou, China. NY: Institute of Electrical and Electronics Engineers, 2016. P. 1-5. doi
In September, HSE University announced the results of a competition of digital projects by early-career HSE scientists. The event was organised within the framework of the strategic project ‘Digital Transformation: Technologies, Effects, Efficiency’. The organisers selected 8 out of 22 applications. The research teams have already started to implement their projects, and the results will be presented at the end of November. The HSE News Service shares the details of three of the highest-scoring projects in the competition. The creators of the projects are staff members of the HSE Center for Language and Brain, MIEM, and the Faculty of Computer Science.
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021), one of the world's largest conferences on machine learning and neural networks, took place on December 6-14, 2021.
One paper will be presented at AISTATS (Japan, April 2019) and three at ICLR (USA, May 2019).
The faculty presented its research results at NeurIPS, the largest international machine learning conference
Researchers from the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montreal, Canada.
In December 2016, five new international laboratories opened at HSE University, one of which was the International Laboratory of Deep Learning and Bayesian Methods. The lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern machine learning: the neural network paradigm and the Bayesian paradigm.