Kirill Struminsky
- Research Fellow, Deputy Head of the Centre: Faculty of Computer Science / AI and Digital Science Institute / Centre of Deep Learning and Bayesian Methods
- Kirill Struminsky has been at HSE University since 2016.
Responsibilities
To conduct research in deep generative modeling and related areas of deep learning.
Courses (2021/2022)
- Research Seminar "Machine Learning and Applications" (Bachelor’s programme; Faculty of Computer Science; 3 year, 1-4 module)Rus
- Research Seminar "Machine Learning and Applications 2" (Bachelor’s programme; Faculty of Computer Science; 4 year, 1-3 module)Rus
Publications
- Chapter Morozov N., Rakitin D., Desheulin O., Vetrov D., Struminsky K. Differentiable Rendering with Reparameterized Volume Sampling, in: Neural Fields across Fields: Methods and Applications of Implicit Neural Representations (ICLR 2023 Workshop), 2023. Ch. 8.
- Chapter Struminsky K., Gadetsky A., Rakitin D., Karpushkin D., Vetrov D. Leveraging Recursive Gumbel-Max Trick for Approximate Inference in Combinatorial Spaces, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 10999-11011. doi
- Chapter Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-Variance Black-Box Gradient Estimates for the Plackett-Luce Distribution, in: Thirty-Fourth AAAI Conference on Artificial Intelligence. Vol. 34. Palo Alto, California, USA: AAAI Press, 2020. P. 10126-10135. doi
- Article Struminsky K., Vetrov D. A Simple Method to Evaluate Support Size and Non-uniformity of a Decoder-Based Generative Model // Lecture Notes in Computer Science. 2019. Vol. 11832. P. 81-93. doi
- Preprint Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-variance Gradient Estimates for the Plackett-Luce Distribution / Bayesian Deep Learning NeurIPS 2019 Workshop, 2019.
- Chapter Atanov A., Ashukha A., Struminsky K., Vetrov D., Welling M. The Deep Weight Prior, in: Proceedings of the 7th International Conference on Learning Representations (ICLR 2019). ICLR, 2019. P. 1-17.
- Chapter Struminsky K., Lacoste-Julien S., Osokin A. Quantifying Learning Guarantees for Convex but Inconsistent Surrogates, in: Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018. P. 1-9.
- Article Figurnov M. V., Struminsky K. A., Vetrov D. P. A Noise-Robust Method for Training a Variational Autoencoder // Intellektual'nye Sistemy. Teoriya i Prilozheniya [Intelligent Systems. Theory and Applications]. 2017. Vol. 21. No. 2. P. 90-109 (in Russian).
- Chapter Struminsky K., Kruglik S., Vetrov D., Oseledets I. A new approach for sparse Bayesian channel estimation in SCMA uplink systems, in: 2016 8th International Conference on Wireless Communications and Signal Processing (WCSP 2016), October 13-15, Yangzhou, China. New York: Institute of Electrical and Electronics Engineers, 2016. P. 1-5. doi
- Preprint Figurnov M., Struminsky K., Vetrov D. Robust Variational Inference / arXiv preprint arXiv:1611.09226. Cornell University, 2016.
‘The Competition Gave Young Researchers an Opportunity to Take the Initiative’
In September, HSE University announced the results of a competition of digital projects by early-career HSE scientists, organised within the framework of the strategic project ‘Digital Transformation: Technologies, Effects, Efficiency’. The organisers selected eight of the 22 applications. The research teams have already started implementing their projects and will present the results at the end of November. The HSE News Service shares the details of three of the highest-scoring projects, created by staff members of the HSE Center for Language and Brain, MIEM, and the Faculty of Computer Science.
Faculty Submits Ten Papers to NeurIPS 2021
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It takes place from December 6 to 14, 2021.
The faculty members will present their research at the ICLR and AISTATS conferences
One paper will be presented at AISTATS (Japan, April 2019) and three papers will be presented at ICLR (USA, May 2019).
The faculty presented its research results at NeurIPS, the largest international machine learning conference
Researchers from the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montreal, Canada.
'Machine Learning Algorithm Able to Find Data Patterns a Human Could Not'
In December 2016, five new international laboratories opened at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. This lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern machine learning: the neural network paradigm and the Bayesian paradigm.