Dmitry Vetrov
- Research Professor: Faculty of Computer Science / Big Data and Information Retrieval School
- Head of the Centre: Faculty of Computer Science / Big Data and Information Retrieval School / Centre of Deep Learning and Bayesian Methods
- Member of the HSE Academic Council
- Dmitry Vetrov has been at HSE University since 2014.
Education and Degrees
- 2007
Candidate of Sciences* (PhD) in Discrete Mathematics and Mathematical Cybernetics
Lomonosov Moscow State University
Thesis Title: The relation of accuracy and stability of the classification algorithms
- 2003
Degree
Lomonosov Moscow State University
According to the International Standard Classification of Education (ISCED) 2011, the Candidate of Sciences degree belongs to ISCED level 8, "doctoral or equivalent", together with the PhD, DPhil, D.Litt., D.Sc., LL.D., Doctorate and similar degrees. The Candidate of Sciences degree qualifies its holders for the position of Associate Professor.
Courses (2022/2023)
- Bayesian Methods for Machine Learning (Bachelor’s programme; Faculty of Computer Science; 4th year, modules 1-2; in Russian)
- Bayesian Methods for Machine Learning (Master’s programme; Faculty of Computer Science; 2nd year, modules 1-2; in Russian)
- Past Courses
Courses (2021/2022)
- Bayesian Methods for Machine Learning (Bachelor’s programme; Faculty of Computer Science; 4th year, modules 1-2; in Russian)
- Neural Bayesian Methods in Machine Learning (Master’s programme; Faculty of Computer Science; 1st year, modules 3-4; in Russian)
Courses (2020/2021)
- Bayesian Methods for Machine Learning (Bachelor’s programme; Faculty of Computer Science; 4th year, modules 1-2; in Russian)
- Neural Bayesian Methods in Machine Learning (Master’s programme; Faculty of Computer Science; 1st year, modules 3-4; in Russian)
- Neurobayesian Models (Master’s programme; Faculty of Computer Science; 2nd year, module 3; in English)
Courses (2018/2019)
- Introductory Research Seminar (Bachelor’s programme; Faculty of Computer Science; 2nd year, module 3; in English)
- Neural Bayesian Methods in Machine Learning (Master’s programme; Faculty of Computer Science; 1st year, modules 3-4; in Russian)
Courses (2017/2018)
Publications (70)
- Chapter Shchekotov I., Andreev P., Ivanov O., Alanov A., Vetrov D. FFC-SE: Fast Fourier Convolution for Speech Enhancement, in: InterSpeech 2022. International Speech Communication Association, 2022. P. 1188-1192. doi
- Chapter Alanov A., Titov V., Vetrov D. HyperDomainNet: Universal Domain Adaptation for Generative Adversarial Networks, in: Thirty-Sixth Conference on Neural Information Processing Systems : NeurIPS 2022. Curran Associates, Inc., 2022. P. 29414-29426.
- Chapter Kodryan M., Lobacheva E., Nakhodnov M., Vetrov D. Training Scale-Invariant Neural Networks on the Sphere Can Happen in Three Regimes, in: Thirty-Sixth Conference on Neural Information Processing Systems : NeurIPS 2022. Curran Associates, Inc., 2022. P. 14058-14070.
- Chapter Bobrov E., Markov A., Panchenko S., Vetrov D. Variational Autoencoders for Precoding Matrices with High Spectral Efficiency, in: Mathematical Optimization Theory and Operations Research: Recent Trends. 21st International Conference, MOTOR 2022, Petrozavodsk, Russia, July 2–6, 2022, Revised Selected Papers. Springer, 2022. Ch. 22. P. 315-326. doi
- Chapter Struminsky K., Gadetsky A., Rakitin D., Karpushkin D., Vetrov D. Leveraging Recursive Gumbel-Max Trick for Approximate Inference in Combinatorial Spaces, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 10999-11011. doi
- Chapter Lobacheva E., Kodryan M., Chirkova N., Malinin A., Vetrov D. On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 21545-21556.
- Chapter Kuznetsov A. S., Shvechikov P., Grishin A., Vetrov D. Controlling Overestimation Bias with Truncated Mixture of Continuous Distributional Quantile Critics, in: International Conference on Machine Learning (ICML 2020) Vol. 119. PMLR, 2020. P. 5556-5566.
- Chapter Polykovskiy D., Vetrov D. Deterministic Decoding for Discrete Data in Variational Autoencoders, in: Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108 Issue 108. PMLR, 2020. P. 3046-3056.
- Chapter Molchanov D., Lyzhov A., Molchanova Y., Ashukha A., Vetrov D. Greedy Policy Search: A Simple Baseline for Learnable Test-Time Augmentation, in: Proceedings of Machine Learning Research, Vol. 124: 36th Conference on Uncertainty in Artificial Intelligence (UAI 2020). PMLR, 2020. P. 1308-1317.
- Chapter Neklyudov K. O., Vetrov D., Welling M., Egorov E. Involutive MCMC: a Unifying Framework, in: International Conference on Machine Learning (ICML 2020) Vol. 119. PMLR, 2020. P. 7273-7282.
- Chapter Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-Variance Black-Box Gradient Estimates for the Plackett-Luce Distribution, in: Thirty-Fourth AAAI Conference on Artificial Intelligence Vol. 34. Palo Alto, California USA: AAAI Press, 2020. P. 10126-10135. doi
- Preprint Kodryan M., Kropotov D., Vetrov D. MARS: Masked Automatic Ranks Selection in Tensor Decompositions / First Workshop on Quantum Tensor Networks in Machine Learning, 34th Conference on Neural Information Processing Systems (NeurIPS 2020). Series QTNML 2020 "First Workshop on Quantum Tensor Networks in Machine Learning, NeurIPS 2020". 2020.
- Chapter Lobacheva E., Chirkova N., Kodryan M., Vetrov D. On Power Laws in Deep Ensembles, in: Advances in Neural Information Processing Systems 33 (NeurIPS 2020). Curran Associates, Inc., 2020. P. 2375-2385.
- Chapter Ashukha A., Molchanov D., Lyzhov A., Vetrov D. Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning, in: Proceedings of the 8th International Conference on Learning Representations (ICLR 2020). ICLR, 2020. P. 1-29.
- Chapter Lobacheva E., Chirkova N., Markovich A., Vetrov D. Structured Sparsification of Gated Recurrent Neural Networks, in: Thirty-Fourth AAAI Conference on Artificial Intelligence Vol. 34. Palo Alto, California USA: AAAI Press, 2020. Ch. 5938. P. 4989-4996. doi
- Chapter Alanov A., Kochurov M., Volkhonskiy D., Yashkov D., Burnaev E., Vetrov D. User-controllable Multi-texture Synthesis with Generative Adversarial Networks, in: Proceedings of the 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISAPP 2020) Vol. 4. SciTePress, 2020. P. 214-221.
- Chapter Kuznetsov M., Polykovskiy D., Vetrov D., Zhebrak A. A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models, in: Advances in Neural Information Processing Systems 32 (NeurIPS 2019). 2019. P. 1-11.
- Chapter Maddox W. J., Izmailov P., Garipov T., Vetrov D., Gordon Wilson A. A Simple Baseline for Bayesian Uncertainty in Deep Learning, in: Advances in Neural Information Processing Systems 32 (NeurIPS 2019). 2019. P. 13153-13164.
- Article Struminsky K., Vetrov D. A Simple Method to Evaluate Support Size and Non-uniformity of a Decoder-Based Generative Model // Lecture Notes in Computer Science. 2019. Vol. 11832. P. 81-93. doi
- Chapter Molchanov D., Kharitonov V., Sobolev A., Vetrov D. Doubly Semi-Implicit Variational Inference, in: Proceedings of Machine Learning Research, Volume 89: The 22nd International Conference on Artificial Intelligence and Statistics (AISTATS 2019). PMLR, 2019. P. 2593-2602.
- Chapter Kodryan M., Grachev A., Ignatov D. I., Vetrov D. Efficient Language Modeling with Automatic Relevance Determination in Recurrent Neural Networks, in: Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) Issue W19-43. Florence, Italy: Association for Computational Linguistics, 2019. P. 40-48. doi
- Chapter Sobolev A., Vetrov D. Importance Weighted Hierarchical Variational Inference, in: Advances in Neural Information Processing Systems 32 (NeurIPS 2019). 2019. P. 1-13.
- Preprint Gadetsky A., Struminsky K., Robinson C., Quadrianto N., Vetrov D. Low-variance Gradient Estimates for the Plackett-Luce Distribution / Bayesian Deep Learning NeurIPS 2019 Workshop. Series 2019 "Bayesian Deep Learning NeurIPS 2019 Workshop". 2019.
- Article Atanov A., Volokhova A., Ashukha A., Sosnovik I., Vetrov D. Semi-Conditional Normalizing Flows for Semi-Supervised Learning // Workshop on Invertible Neural Nets and Normalizing Flows, International Conference on Machine Learning. 2019. P. 1-9.
- Chapter Atanov A., Volokhova A., Ashukha A., Vetrov D. Semi-Conditional Normalizing Flows for Semi-Supervised Learning, in: First workshop on Invertible Neural Networks and Normalizing Flows (ICML 2019). INNF, 2019. P. 1-9.
- Chapter Lobacheva E., Chirkova N., Markovich A., Vetrov D. Structured Sparsification of Gated Recurrent Neural Networks, in: Workshop on Context and Compositionality in Biological and Artificial Neural Systems, Thirty-third Conference on Neural Information Processing Systems. Vancouver, 2019. P. 1-4.
- Chapter Vetrov D., Izmailov P., Maddox W. J., Kirichenko P., Garipov T., Gordon Wilson A. Subspace Inference for Bayesian Deep Learning, in: Proceedings of the 35th Uncertainty in Artificial Intelligence Conference (UAI 2019). 2019. P. 1-11.
- Chapter Atanov A., Ashukha A., Struminsky K., Vetrov D., Welling M. The Deep Weight Prior, in: Proceedings of the 7th International Conference on Learning Representations (ICLR 2019). ICLR, 2019. P. 1-17.
- Chapter Vetrov D., Neklyudov K. O., Egorov E. The Implicit Metropolis-Hastings Algorithm, in: Advances in Neural Information Processing Systems 32 (NeurIPS 2019). 2019. P. 13954-13964.
- Chapter Neklyudov K. O., Molchanov D., Ashukha A., Vetrov D. Variance Networks: When Expectation Does Not Meet Your Expectations, in: Proceedings of the 7th International Conference on Learning Representations (ICLR 2019). ICLR, 2019. P. 1-16.
- Chapter Vetrov D., Ivanov O. Variational Autoencoder with Arbitrary Conditioning, in: Proceedings of the 7th International Conference on Learning Representations (ICLR 2019). ICLR, 2019. P. 1-25.
- Chapter Izmailov P., Garipov T., Podoprikhin D., Vetrov D., Gordon Wilson A. Averaging Weights Leads to Wider Optima and Better Generalization, in: Proceedings of the International Conference on Uncertainty in Artificial Intelligence (UAI 2018). 2018. P. 876-885.
- Chapter Chirkova N., Lobacheva E., Vetrov D. Bayesian Compression for Natural Language Processing, in: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2018. P. 2910-2915.
- Chapter Lobacheva E., Chirkova N., Vetrov D. Bayesian Sparsification of Gated Recurrent Neural Networks, in: Workshop on Compact Deep Neural Network Representation with Industrial Applications, Thirty-second Conference on Neural Information Processing Systems. Montréal, 2018. P. 1-6.
- Chapter Gadetsky A., Yakubovskiy I., Vetrov D. Conditional Generators of Words Definitions, in: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics Vol. 2: Short Papers. Association for Computational Linguistics, 2018. P. 266-271.
- Preprint Molchanov D., Kharitonov V., Sobolev A., Vetrov D. Doubly Semi-Implicit Variational Inference / Cornell University. Series arxiv.org "stat.ML". 2018.
- Article Polykovskiy D., Zhebrak A., Vetrov D., Ivanenkov Y., Aladinskiy V., Mamoshina P., Bozdaganyan M., Aliper A., Zhavoronkov A., Kadurin A. Entangled Conditional Adversarial Autoencoder for de Novo Drug Discovery // Molecular Pharmaceutics. 2018. Vol. 15. No. 10. P. 4398-4405. doi
- Chapter Garipov T., Izmailov P., Podoprikhin D., Vetrov D., Gordon Wilson A. Loss Surfaces, Mode Connectivity, and Fast Ensembling of DNNs, in: Advances in Neural Information Processing Systems 31 (NIPS 2018). 2018. P. 1-10.
- Article Spesivtsev P., Sinkov K., Sofronov I., Zimina A., Umnov A., Yarullin R., Vetrov D. Predictive Model for the Bottomhole Pressure based on Machine Learning // Journal of Petroleum Science and Engineering. 2018. Vol. 166. P. 825-841. doi
- Article Figurnov M., Sobolev A., Vetrov D. Probabilistic adaptive computation time // Bulletin of the Polish Academy of Sciences: Technical Sciences. 2018. Vol. 66. No. 6. P. 811-820. doi
- Chapter Ashukha A., Vetrov D., Molchanov D., Neklyudov K. O., Atanov A. Uncertainty Estimation via Stochastic Batch Normalization, in: Workshop of the 6th International Conference on Learning Representations (ICLR). International Conference on Learning Representations, ICLR, 2018. P. 1-6.
- Preprint Kharitonov V., Molchanov D., Vetrov D. Variational Dropout via Empirical Bayes / Cornell University. Series arxiv.org "stat.ML". 2018.
- Chapter Lobacheva E., Chirkova N., Vetrov D. Bayesian Sparsification of Recurrent Neural Networks, in: 1st Workshop on Learning to Generate Natural Language, International Conference on Machine Learning. 2017. P. 1-8.
- Chapter Figurnov M., Collins M. D., Zhu Y., Zhang L., Huang J., Vetrov D., Salakhutdinov R. Spatially Adaptive Computation Time for Residual Networks, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017). Curran Associates, Inc., 2017. P. 1790-1799. doi
- Chapter Neklyudov K. O., Molchanov D., Ashukha A., Vetrov D. Structured Bayesian Pruning via Log-Normal Multiplicative Noise, in: Advances in Neural Information Processing Systems 30 (NIPS 2017). Montreal : Curran Associates, 2017. P. 6776-6785.
- Chapter Molchanov D., Ashukha A., Vetrov D. Variational Dropout Sparsifies Deep Neural Networks, in: Proceedings of Machine Learning Research, Vol. 70: International Conference on Machine Learning (ICML 2017). Sydney, 2017. P. 2498-2507.
- Article Figurnov M., Struminsky K., Vetrov D. A Noise-Robust Method for Training a Variational Autoencoder // Intellektualnye Sistemy. Teoriya i Prilozheniya (Intelligent Systems. Theory and Applications). 2017. Vol. 21. No. 2. P. 90-109. (in Russian)
- Chapter Struminsky K., Kruglik S., Vetrov D., Oseledets I. A new approach for sparse Bayesian channel estimation in SCMA uplink systems, in: 2016 8th International Conference on Wireless Communications and Signal Processing, WCSP 2016. October 13 - 15, Yangzhou, China. NY : Institute of Electrical and Electronic Engineers, 2016. P. 1-5. doi
- Article Bartunov S., Vetrov D., Kondrashkin D., Osokin A. Breaking Sticks and Ambiguities with Adaptive Skip-gram // Journal of Machine Learning Research. 2016. Vol. 51. P. 130-138.
- Chapter Bartunov S., Kondrashkin D., Osokin A., Vetrov D. Breaking Sticks and Ambiguities with Adaptive Skip-gram, in: Proceedings of Machine Learning Research, Vol. 51: The International Conference on Artificial Intelligence and Statistics (AISTATS 2016). Cadiz, 2016. P. 130-138.
- Chapter Kirillov A., Gavrikov M., Lobacheva E., Osokin A., Vetrov D. Deep Part-Based Generative Shape Model with Latent Variables, in: Proceedings of the 27th British Machine Vision Conference. 2016. P. 1-12. doi
- Chapter Figurnov M., Ibraimova A., Vetrov D., Kohli P. PerforatedCNNs: Acceleration through Elimination of Redundant Convolutions, in: Advances in Neural Information Processing Systems 29 (NIPS 2016). NY : Curran Associates, 2016.
- Preprint Figurnov M., Struminsky K., Vetrov D. Robust Variational Inference / Cornell University. Series arXiv:1611.09226 "arxiv.org". 2016.
- Preprint Figurnov M., Collins M. D., Zhu Y., Zhang L., Huang J., Vetrov D., Salakhutdinov R. Spatially Adaptive Computation Time for Residual Networks / Cornell University. Series arXiv "arXiv:1612.02297". 2016.
- Preprint Bartunov S., Kondrashkin D., Osokin A., Vetrov D. Breaking Sticks and Ambiguities with Adaptive Skip-gram / arXiv.org. Series arXiv:1502.07257 "Computation and Language". 2015.
- Article Vetrov D., Kohli P., Osokin A., Shapovalov R. V. Multi-utility Learning: Structured-Output Learning with Multiple Annotation-Specific Loss Functions // Lecture Notes in Computer Science. 2015. Vol. 8932. P. 406-420.
- Article Osokin A., Vetrov D. Submodular Relaxation for Inference in Markov Random Fields // IEEE Transactions on Pattern Analysis and Machine Intelligence. 2015. Vol. 37. No. 7. P. 1347-1359.
- Chapter Novikov A., Podoprikhin D., Osokin A., Vetrov D. Tensorizing Neural Networks, in: Advances in Neural Information Processing Systems 28 (NIPS 2015). NY : Curran Associates, 2015.
- Article Kirillov A., Gavrikov M., Lobacheva E., Osokin A., Vetrov D. A Multi-Class Shape Model with Latent Variables // Intellektualnye Sistemy. Teoriya i Prilozheniya (Intelligent Systems. Theory and Applications). 2015. Vol. 19. No. 2. P. 75-95. (in Russian)
- Article Vetrov D., Osokin A., Rodomanov A., Novikov A. Putting MRFs on a Tensor Train // Journal of Machine Learning Research. 2014.
- Chapter Vetrov D., Osokin A., Novikov A., Rodomanov A. Putting MRFs on a Tensor Train, in: JMLR Workshop and Conference Proceedings Issue 32: Proceedings of The 31st International Conference on Machine Learning. Beijing : Microtome Publishing, 2014. P. 811-819.
- Chapter Bartunov S., Vetrov D. Variational Inference for Sequential Distance Dependent Chinese Restaurant Process, in: JMLR Workshop and Conference Proceedings Issue 32: Proceedings of The 31st International Conference on Machine Learning. Beijing : Microtome Publishing, 2014. P. 1404-1412.
- Article Bartunov S. O., Vetrov D. Variational Inference for Sequential Distance Dependent Chinese Restaurant Process // Journal of Machine Learning Research. 2014. Vol. 32. No. 1. P. 1404-1412.
- Article Novikov A., Rodomanov A., Osokin A., Vetrov D. A Tensor Train in a Markov Random Field // Intellektualnye Sistemy. Teoriya i Prilozheniya (Intelligent Systems. Theory and Applications). 2014. Vol. 18. No. 4. P. 293-318. (in Russian)
- Article Vetrov D., Voronin P. An Approach to Segmentation of Mouse Brain Images via Intermodal Registration // Pattern Recognition and Image Analysis. 2013. Vol. 23. No. 2. P. 335-339.
- Article Nekrasov K., Laptev D., Vetrov D. Automatic Determination of Cell Division Rate Using Microscope Images // Pattern Recognition and Image Analysis. 2013. Vol. 23. No. 1. P. 1-6.
- Article Yangel B. K., Vetrov D. Learning a Model for Shape-Constrained Image Segmentation from Weakly Labeled Data // Lecture Notes in Computer Science. 2013. Vol. 8081. P. 137-150.
- Book Shapovalov R. V., Vetrov D., Kohli P. Proceedings of International Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2013.
- Chapter Osokin A., Vetrov D., Kolmogorov V. Submodular decomposition framework for inference in associative Markov networks with global constraints, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2011). Colorado Springs : IEEE, 2011. P. 1889-1896. doi
Employment history
2017-present, head of the Centre of Deep Learning and Bayesian Methods (National Research University Higher School of Economics, Faculty of Computer Science)
2018-2020, head of the Samsung-HSE Laboratory (National Research University Higher School of Economics, Faculty of Computer Science)
2016-present, research professor, National Research University Higher School of Economics, Faculty of Computer Science
2016-2018, Yandex, leading researcher (part time)
2015-2016, Skoltech, associate professor
2014-2016, National Research University Higher School of Economics, Faculty of Computer Science, associate professor (part time)
2014-2015, Moscow State University, Faculty of Computational Mathematics and Cybernetics, associate professor
2011-2014, Moscow State University, Faculty of Computational Mathematics and Cybernetics, assistant professor
2010-2012, Kurchatov Institute, NBIC centre, head of laboratory (part time)
2007-2011, Moscow State University, Faculty of Computational Mathematics and Cybernetics, researcher
Summer 2005 and 2006, University of Wales, Bangor, intern
2000-2007, Dorodnicyn Computing Centre of the Russian Academy of Sciences, technical assistant (part time)
Three HSE Researchers Receive Ilya Segalovich Award
Three researchers of the HSE Faculty of Computer Science are among the winners of the 2022 Ilya Segalovich Award: Research Professor Dmitry Vetrov, Associate Professor Alexey Naumov and doctoral student Sergey Samsonov. The award, established by Yandex in 2019, is aimed at supporting young researchers and the scientific community in the field of IT in Russia, Belarus and Kazakhstan.
Fall into ML: Autumn School and Conference on Machine Learning Held at HSE University
On November 1-3, 2022, the International Laboratory of Stochastic Algorithms and High-Dimensional Inference and the Laboratory of Methods for Big Data Analysis of the HSE Faculty of Computer Science, with the support of the HSE AI Centre and the Russian Science Foundation, organized 'Fall into ML', the first autumn school and conference on artificial intelligence. The new format of the event included a school for students and young researchers.
17 Articles by Researchers of HSE Faculty of Computer Science Accepted at NeurIPS
In 2022, 17 articles by the researchers of HSE Faculty of Computer Science were accepted at the NeurIPS (Conference and Workshop on Neural Information Processing Systems), one of the world’s most prestigious events in the field of machine learning and artificial intelligence. The 36th conference will be held in a hybrid format from November 28th to December 9th in New Orleans (USA).
Faculty Submits Ten Papers to NeurIPS 2021
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It took place on December 6-14, 2021.
HSE University Receives Government Grant to Create AI Research Centre
A competition of research centres seeking grants through the Artificial Intelligence federal project has concluded, and HSE University is among the winners. The winning centres will focus on developing new AI technologies that broaden AI's range of applications, overcome existing limitations in solving applied problems, and optimize AI models.
'When We Find Something That Doesn’t Work Well Enough, We Replace It in Order to Develop a More Effective Approach'
The winners of the third annual Ilya Segalovich Award were recently announced in Moscow. Established by Yandex, the award promotes the scientific endeavours of young researchers from Russia, Belarus, and Kazakhstan in the field of Computer Science. Among this year’s winners were three HSE students, including Alexander Grishin, a Doctoral student of the Big Data and Information Retrieval School of the HSE Faculty of Computer Science. Alexander spoke to us about his work, research challenges, and why he was surprised to receive the award.
Ilya Segalovich Scholarship Winners Interviews. Part 1
The winners of the Ilya Segalovich Scholarship were announced recently. We talked to them about the award, their studies and their projects.
Two HSE University Researchers Among Winners of Research Excellence Award Russia 2021
Elsevier, which operates Scopus, has presented its prestigious Research Excellence Award since 2005. This year, it celebrates the Year of Science and Technology in Russia. On March 30, Elsevier honoured the winning scholars, who included Maxim Kotsemir, Research Fellow at the HSE Institute for Statistical Studies and Economics of Knowledge, and Dmitry Vetrov, Professor at the HSE Faculty of Computer Science.
NeurIPS — 2020 Accepts Three Articles from Faculty of Computer Science’s Researchers
The 34th Conference on Neural Information Processing Systems (NeurIPS 2020) is one of the largest machine learning conferences in the world and has been held since 1989. It was due to take place in Vancouver, Canada on December 6-12, but was held online instead. NeurIPS is as prestigious as ever, with 9,454 articles submitted and 1,300 accepted. Among those accepted are three articles by researchers of the Faculty of Computer Science.
The paper "On Power Laws in Deep Ensembles" accepted as a spotlight to NeurIPS'20
An oral presentation by the Laboratory's researchers at one of the largest AI conferences.
The paper by researchers of the Centre earned a bronze award at Samsung's largest research competition
Researchers of the Centre received a bronze prize at the Samsung Best Paper Award 2020 competition.
Dmitry Vetrov Becomes First Russian Scientist to Join ELLIS Society
ELLIS is a leading European association in the field of artificial intelligence
HSE Showcases Innovations in Urbanism and Neural Networks at Russia’s Geek Picnic 2020
This year, Russia’s largest science and technology festival, Geek Picnic, was held online for the first time. Despite the new format, the festival programme included all of the event’s usual key features: expert lectures, workshops, competitions, and opportunities to socialize and network with fellow tech and science enthusiasts. HSE University once again served as the festival’s content partner.
Graduate Talks: Polina Kirichenko
In 2018, Polina graduated with honours from the Faculty of Computer Science's Applied Mathematics and Computer Science bachelor's programme. While studying, she worked for three years in the Bayesian Methods Research Group. Last year, Polina began a PhD programme at the School of Operations Research and Information Engineering, Cornell University (USA).
The third Summer School on Deep Learning and Bayesian Methods was held in Moscow
The Summer School on Bayesian Methods in Deep Learning was held in Moscow once again, gathering participants from 27 countries.
First Cohort Graduates from Master’s Programme in Statistical Learning Theory
The Master's Programme in Statistical Learning Theory was launched in 2017. It is run jointly with the Skolkovo Institute of Science and Technology (Skoltech). The programme trains future scientists to effectively carry out fundamental research and work on new challenging problems in statistical learning theory, one of the most promising fields of science. Yury Kemaev and Maxim Kaledin, from the first cohort of programme graduates, sat down with HSE News Service to talk about their studies and plans for the future.
Faculty of Computer Science Hosted 11 International Students for Research Internships This Spring
Students from France, UK, Italy, Switzerland, Togo, and Albania worked on research projects at the labs of the HSE Faculty of Computer Science during Spring Semester 2019.
Ilya Segalovich Scholarships Awarded on the Fifth Anniversary of HSE’s Faculty of Computer Science
As part of the HSE Faculty of Computer Science fifth anniversary celebration at Mercury Moscow City Tower, Ilya Segalovich Scholarships were awarded.
The faculty members will present their research at the ICLR and AISTATS conferences
One paper will be presented at AISTATS (Japan, April 2019) and three papers will be presented at ICLR (USA, May 2019).
The faculty presented the results of their research at the largest international machine learning conference NeurIPS
Researchers of the Faculty of Computer Science presented their papers at the annual conference of Neural Information Processing Systems (NeurIPS), which was held from 2 to 8 December 2018 in Montreal, Canada.
International Data Analysis Olympiad (IDAO) to Be Held for the Second Time
The HSE Faculty of Computer Science and Yandex with the support of Sberbank are to organize the 2nd International Data Analysis Olympiad (IDAO). The Olympiad is held by leading experts in data analysis for their future colleagues and aims to bring together analysts, scientists, professionals, and junior researchers.
DeepBayes 2018: More Bayesian Methods in Deep Learning
The second Summer School on Deep Learning and Bayesian Methods was held in Moscow from August 27 to September 1, this year in English. Over six days, participants studied and implemented Bayesian methods in neural networks, exchanged experience, and discussed research ideas.
HSE Professor to Head Up Machine Learning Research at Samsung Centre for Artificial Intelligence
On May 29, Samsung opened its new Artificial Intelligence Centre in Moscow. Dmitry Vetrov, Professor of the HSE Faculty of Computer Science, will become one of its leaders and oversee research in machine learning.
HSE Student to Present at Association for Computational Linguistics Conference
An article by Artyom Gadetsky, a senior at the HSE Faculty of Computer Science, was accepted for presentation at the international conference of the Association for Computational Linguistics, the only A*-level conference on computational linguistics. According to the CORE system, which ranks major conferences in computer science, A* is the highest category a conference can hold.
HSE University Opens Joint Laboratory with Samsung Research
Samsung-HSE Laboratory will develop mechanisms of Bayesian inference in modern neural networks, which will solve a number of problems in deep learning. The laboratory team will be made up of the members of the Bayesian Methods Research Group — one of the strongest scientific groups in Russia in the field of machine learning and Bayesian inference. It will be headed by a professor of the Higher School of Economics Dmitry Vetrov.
How to Adjust a Smaller Size Neural Network without Quality Loss
Staff members of the HSE Faculty of Computer Science recently presented their papers at the biggest international conference on machine learning, Neural Information Processing Systems (NIPS).
First International Data Analysis Olympiad to Be Held in Russia
The IDAO (International Data Analysis Olympiad), created by leading experts in data analysis for their future colleagues, aims to bring together analysts, scientists, professionals, and junior researchers from all over the world on a single platform. This is the first time an event of this scale will be held in Russia. The HSE Faculty of Computer Science, Yandex and Harbour.Space University organize the Olympiad with the support of Sberbank.
HSE and University of London: Joint BA Programme in Applied Data Analysis
In 2018, the Higher School of Economics will launch an English-taught double degree programme in partnership with the University of London in Applied Data Analysis. Graduates will be awarded an undergraduate degree from HSE in Applied Mathematics and Information Science and a Bachelor of Science in Data Science and Business Analytics from the University of London. International applicants are invited to apply online starting November 15, 2017.
Second Summer School on Bayesian Methods in Deep Learning announced
During his talk at the Sberbank Data Science Journey, Dmitry Vetrov announced the second summer school on Neuro-Bayesian methods, to be held at HSE in August 2018.
Dmitry Vetrov Took Part in the Samsung AI Forum
On October 19-20, Dmitry Vetrov, head of the Deep Learning and Bayesian Methods laboratory, took part in the forum on artificial intelligence held at Samsung's corporate headquarters.
Pham Cong Thang – Pursuing a Post-Doc in Computer Science
On September 12, Pham Cong Thang, who goes by Thang, arrived in Moscow to begin a post-doctoral fellowship at HSE’s International Laboratory of Deep Learning and Bayesian Methods (Faculty of Computer Science, Big Data and Information Retrieval School). Working under the supervision of Professor Dmitry Vetrov, Thang will focus primarily on text and image processing, and computer vision.
Joint Projects in Big Data Discussed at HSE
The workshop on ‘Big Data and Applications’ was organized by HSE School of Business Informatics, Institut Mines-Télécom in association with CNRS MADICS with the support of the French Embassy in Russia, and the French Ministry of Higher Education, Research and Innovation.
Bayesian Methods in Deep Learning Summer School in Moscow
The Bayesian Methods in Deep Learning Summer School was held in Moscow from 26 to 30 August. Over these five days, 96 participants from 8 countries attended lectures on Bayesian methods in deep learning and trained neural networks.
Faculty of Computer Science Staff Attend International Conference on Machine Learning
On August 6-11 the 34th International Conference on Machine Learning was held in Sydney, Australia. This conference is ranked A* by CORE, and is one of two leading conferences in the field of machine learning. It has been held annually since 2000, and this year, more than 1,000 participants from different countries took part.
A paper on CVPR 2017
Michael Figurnov's paper, written in collaboration with Dmitry Vetrov and researchers from Google and Carnegie Mellon University, was presented at the Computer Vision and Pattern Recognition (CVPR) conference, held from 21 to 26 July in Honolulu, USA.
Machines Can See: International Summit on Computer Vision
On June 9, a summit was held on computer vision and deep learning ‘Machines can see’, organized by Sistema VC, Visionlabs, and the Strelka Institute. Dmitry Vetrov and Anton Konushin, staff members of HSE Faculty of Computer Science, were among the organizers of and speakers at the conference.
'Machine Learning Algorithm Able to Find Data Patterns a Human Could Not'
In December 2016, five new international laboratories opened up at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. This lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern-day machine learning – the neural network paradigm and the Bayesian paradigm.
Variational dropout sparsifies DNNs paper has been accepted to ICML'17
The paper authored by the laboratory's research assistants Dmitry Molchanov and Arsenii Ashukha and its head Dmitry Vetrov has been accepted to the International Conference on Machine Learning 2017. The research achieved a state-of-the-art result in deep neural network sparsification by applying a Bayesian framework to deep learning.
Collaboration with Samsung Opens New Perspectives for the Laboratory and the Faculty
Dmitry Vetrov, head of the laboratory, held a meeting with Mr. Shi-Hwa Lee, a Vice-President of Samsung, a company the laboratory collaborates with. Interim research results, internship possibilities and collaboration perspectives were discussed.
New International Laboratories Opening up at HSE
On December 23, 2016, the HSE Academic Council approved the creation of four new laboratories: the International Laboratory for the Study of Russian and European Intellectual Dialogue, the International Laboratory for Population and Health Studies, the International Laboratory of Deep Learning and Bayesian Methods, and the International Laboratory for Supercomputer Atomistic Modelling and Multi-scale Analysis.
‘Our Programme Aims to Make a Research Breakthrough at the Intersection of Mathematics and Computer Science’
In 2017, the HSE Faculty of Computer Science and Skoltech are opening admissions to the Master's programme in Statistical Learning Theory, which will become the successor to the Mathematical Methods of Optimization and Stochastics programme. Vladimir Spokoiny, the programme's academic supervisor and professor of mathematics at Humboldt University in Berlin, told us about the research part of the new programme and the opportunities it offers to both Master's students and undergraduate students alike.
Big Data: Prospects for Russian-French Cooperation in Science and Technology
Participants of the ‘Big Data Applications’ research workshop, which took place in the beginning of December, discussed big data and prospects for Russian-French cooperation in this area. The workshop, held at HSE, brought together about 50 participants from leading research centres, universities, governmental bodies and IT companies in both Russia and France.
Tensorizing Deep Neural Networks
The article ‘Tensorizing Neural Networks’, prepared by the Bayesian Methods Research Group under the supervision of Dmitry Vetrov, Associate Professor at HSE's Faculty of Computer Science, has been accepted by the NIPS conference, the largest global forum on cognitive research, artificial intelligence, and machine learning, rated A* by the international CORE ranking. This year it is being held December 7-12 in Montreal. Here Dmitry Vetrov talks about the research he presented and about why delivering reports at conferences is better than getting published in the academic press.
Two Papers by Dmitry Vetrov Accepted at NIPS Conference
The Twenty-ninth Annual Conference on Neural Information Processing Systems (NIPS) is a single-track machine learning and computational neuroscience conference that includes invited talks, demonstrations and oral and poster presentations of refereed papers. All of the key breakthroughs in machine learning over the last 15 years were first presented at this conference. The conference is assigned to the highest category (A*) in the CORE Conference Ranking.
Reports by Ekaterina Lobacheva and Dmitry Vetrov Accepted at ICCV 2015
The reports by Ekaterina Lobacheva, a doctoral student, and Dmitry Vetrov, Associate Professor at the Big Data and Information Retrieval School, were accepted by the organisers of the International Conference on Computer Vision, which holds the highest rank (A*) in the international rating of IT conferences.