- Research Professor: Faculty of Computer Science / Big Data and Information Retrieval School
- Laboratory Head: Faculty of Computer Science / Big Data and Information Retrieval School / International Laboratory of Deep Learning and Bayesian Methods
- Dmitry Vetrov has been at HSE since 2014.
Education and Degrees
Candidate of Sciences* (PhD) in Discrete Mathematics and Mathematical Cybernetics
Lomonosov Moscow State University
Thesis Title: The Relation of Accuracy and Stability of Classification Algorithms
According to the International Standard Classification of Education (ISCED) 2011, the Candidate of Sciences belongs to ISCED level 8, "doctoral or equivalent", together with the PhD, DPhil, D.Litt., D.Sc., LL.D., Doctorate and similar degrees. The Candidate of Sciences qualifies its holders for the position of Associate Professor.
- Article Bartunov S., Vetrov D., Kondrashkin D., Osokin A. Breaking Sticks and Ambiguities with Adaptive Skip-gram // Journal of Machine Learning Research. 2016. Vol. 51. P. 130-138.
- Chapter Kirillov A., Gavrikov M., Lobacheva E., Osokin A., Vetrov D. Deep Part-Based Generative Shape Model with Latent Variables, in: Proceedings of the 27th British Machine Vision Conference. BMVA Press, 2016.
- Chapter Figurnov M., Ibraimova A., Vetrov D., Kohli P. PerforatedCNNs: Acceleration through Elimination of Redundant Convolutions, in: Advances in Neural Information Processing Systems 29 (NIPS 2016). NY : Curran Associates, 2016.
- Preprint Figurnov M., Collins M. D., Zhu Y., Zhang L., Huang J., Vetrov D., Salakhutdinov R. Spatially Adaptive Computation Time for Residual Networks / Cornell University. Series arXiv "arXiv:1612.02297". 2016.
- Preprint Bartunov S., Kondrashkin D. A., Osokin A., Vetrov D. Breaking Sticks and Ambiguities with Adaptive Skip-gram / arXiv.org. Series arXiv:1502.07257 "Computation and Language". 2015.
- Article Vetrov D., Kohli P., Osokin A., Shapovalov R. V. Multi-utility Learning: Structured-Output Learning with Multiple Annotation-Specific Loss Functions // Lecture Notes in Computer Science. 2015. Vol. 8932. P. 406-420.
- Article Osokin A., Vetrov D. Submodular Relaxation for Inference in Markov Random Fields // IEEE Transactions on Pattern Analysis and Machine Intelligence. 2015. Vol. 37. No. 7. P. 1347-1359.
- Chapter Novikov A., Podoprikhin D., Osokin A., Vetrov D. Tensorizing neural networks, in: Advances in Neural Information Processing Systems 28 (NIPS 2015). NY : Curran Associates, 2015.
- Article Kirillov A. N., Gavrikov M. I., Lobacheva E. M., Osokin A. A., Vetrov D. P. A Multi-class Shape Model with Latent Variables // Intellektualnye Sistemy. Teoriya i Prilozheniya [Intelligent Systems. Theory and Applications]. 2015. Vol. 19. No. 2. P. 75-95.
- Article Vetrov D., Osokin A., Rodomanov A., Novikov A. Putting MRFs on a Tensor Train // Journal of Machine Learning Research. 2014.
- Chapter Bartunov S., Vetrov D. Variational Inference for Sequential Distance Dependent Chinese Restaurant Process, in: JMLR Workshop and Conference Proceedings Issue 32: Proceedings of The 31st International Conference on Machine Learning. Beijing : Microtome Publishing, 2014. P. 1404-1412.
- Article Bartunov S. O., Vetrov D. Variational Inference for Sequential Distance Dependent Chinese Restaurant Process. // Journal of Machine Learning Research. 2014. Vol. 32. No. 1. P. 1404-1412.
- Article Vetrov D., Voronin P. An Approach to Segmentation of Mouse Brain Images via Intermodal Registration // Pattern Recognition and Image Analysis. (Advances in Mathematical Theory and Applications). 2013. Vol. 23. No. 2. P. 335-339.
- Article Nekrasov K., Laptev D., Vetrov D. Automatic Determination of Cell Division Rate Using Microscope Images // Pattern Recognition and Image Analysis. (Advances in Mathematical Theory and Applications). 2013. Vol. 23. No. 1. P. 1-6.
- Article Yangel B. K., Vetrov D. Learning a Model for Shape-Constrained Image Segmentation from Weakly Labeled Data. // Lecture Notes in Computer Science. 2013. Vol. 8081. P. 137-150.
- Book Shapovalov R. V., Vetrov D., Kohli P. Proceedings of International Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2013.
On December 23, 2016, the HSE Academic Council approved the creation of four new laboratories: the International Laboratory for the Study of Russian and European Intellectual Dialogue, the International Laboratory for Population and Health Studies, the International Laboratory of Deep Learning and Bayesian Methods, and the International Laboratory for Supercomputer Atomistic Modelling and Multi-scale Analysis.
‘Our Programme Aims to Make a Research Breakthrough at the Intersection of Mathematics and Computer Science’
In 2017, the HSE Faculty of Computer Science and Skoltech are opening admissions to the Master’s programme in Statistical Learning Theory, which will become the successor to the Mathematical Methods of Optimization and Stochastics programme. Vladimir Spokoiny, the programme’s academic supervisor and professor of mathematics at Humboldt University in Berlin, told us about the research part of the new programme and the opportunities it offers to both Master’s students and undergraduate students alike.
Participants of the ‘Big Data Applications’ research workshop, which took place in the beginning of December, discussed big data and prospects for Russian-French cooperation in this area. The workshop, held at HSE, brought together about 50 participants from leading research centres, universities, governmental bodies and IT companies in both Russia and France.
The article ‘Tensorizing Neural Networks’, prepared by the Bayesian Methods Research Group under the supervision of Dmitry Vetrov, Associate Professor at HSE’s Faculty of Computer Science, has been accepted by the NIPS conference – the largest global forum on cognitive research, artificial intelligence, and machine learning, rated A* by the international CORE ranking. This year it is being held December 7-12 in Montreal. Here Dmitry Vetrov talks about the research he presented and about why presenting at conferences is better than publishing in the academic press.
The Twenty-ninth Annual Conference on Neural Information Processing Systems (NIPS) is a single-track machine learning and computational neuroscience conference that includes invited talks, demonstrations and oral and poster presentations of refereed papers. All of the key breakthroughs in machine learning over the last 15 years were first presented at this conference. The conference is assigned to the highest category (A*) in the CORE Conference Ranking.
The reports by Ekaterina Lobacheva, doctoral student, and Dmitry Vetrov, Associate Professor at the Big Data and Information Retrieval School, were accepted by the organisers of the International Conference on Computer Vision, which holds the highest rank (A*) in the international rating of IT conferences.