Nadezhda Chirkova
- Visiting Lecturer: Faculty of Computer Science / Big Data and Information Retrieval School
- Senior Research Fellow, Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Centre of Deep Learning and Bayesian Methods
- Nadezhda Chirkova has been at HSE University since 2016.
Education
Bachelor, Master in Applied Mathematics and Information Science
Lomonosov Moscow State University

Young Faculty Support Program (Group of Young Academic Professionals)
Category "New Researchers" (2018)
Courses (2020/2021)
- Bayesian Methods for Data Analysis (Master’s programme; Faculty of Computer Science; 2 year, 1 module; in Russian)
- Introduction to Data Analysis (Minor; Faculty of Computer Science; 3, 4 module; in Russian)
- Machine Learning (Bachelor’s programme; Faculty of Economic Sciences; 4 year, 1, 2 module; in Russian)
- Machine Learning (Bachelor’s programme; Faculty of Economic Sciences; 3 year, 1, 2 module; in Russian)
- Machine Learning (Bachelor’s programme; Faculty of Economic Sciences; 2 year, 1, 2 module; in Russian)
Courses (2019/2020)
- Bayesian Methods for Data Analysis (Master’s programme; Faculty of Computer Science; 2 year, 1, 2 module; in Russian)
- Introduction to Data Analysis (Minor; Faculty of Computer Science; 3, 4 module; in Russian)
- Machine Learning (Bachelor’s programme; Faculty of Economic Sciences; 4 year, 1, 2 module; in Russian)
- Machine Learning (Bachelor’s programme; Faculty of Economic Sciences; 3 year, 1, 2 module; in Russian)
Courses (2018/2019)
- Introduction to Data Analysis (Minor; Faculty of Computer Science; 3, 4 module; in Russian)
- Machine Learning (Master’s programme; Faculty of Computer Science; 1 year, 1, 2 module; in Russian)
Publications (7)
- Chapter Lobacheva E., Chirkova N., Kodryan M., Vetrov D. On Power Laws in Deep Ensembles, in: Advances in Neural Information Processing Systems 33 (NeurIPS 2020). Curran Associates, Inc., 2020. P. 2375-2385.
- Chapter Lobacheva E., Chirkova N., Markovich A., Vetrov D. Structured Sparsification of Gated Recurrent Neural Networks, in: Thirty-Fourth AAAI Conference on Artificial Intelligence. Vol. 34. Palo Alto, CA: AAAI Press, 2020. P. 4989-4996.
- Chapter Lobacheva E., Chirkova N., Markovich A., Vetrov D. Structured Sparsification of Gated Recurrent Neural Networks, in: Workshop on Context and Compositionality in Biological and Artificial Neural Systems, Thirty-Third Conference on Neural Information Processing Systems (NeurIPS 2019). Vancouver, 2019. P. 1-4.
- Chapter Chirkova N., Lobacheva E., Vetrov D. Bayesian Compression for Natural Language Processing, in: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2018. P. 2910-2915.
- Chapter Lobacheva E., Chirkova N., Vetrov D. Bayesian Sparsification of Gated Recurrent Neural Networks, in: Workshop on Compact Deep Neural Network Representation with Industrial Applications, Thirty-Second Conference on Neural Information Processing Systems (NeurIPS 2018). Montréal, 2018. P. 1-6.
- Chapter Lobacheva E., Chirkova N., Vetrov D. Bayesian Sparsification of Recurrent Neural Networks, in: 1st Workshop on Learning to Generate Natural Language, International Conference on Machine Learning (ICML 2017). 2017. P. 1-8.
- Article Chirkova N. A., Vorontsov K. V. Additive Regularization for Hierarchical Multimodal Topic Modeling // Journal of Machine Learning and Data Analysis. 2016. Vol. 2. No. 2. P. 187-200.
Employment history
Junior Machine Learning Researcher, Antiplagiat JSC, July 2016 to August 2016. Developed a prototype of a domain-specific search system incorporating a hierarchical topic structure learned from domain data.
Machine Learning Teaching Assistant, Coursera machine learning specialization, January 2016 to March 2017. Developed practical assignments that help students understand how machine learning algorithms work.
NeurIPS 2020 Accepts Three Papers by Faculty of Computer Science Researchers
The 34th Conference on Neural Information Processing Systems (NeurIPS 2020) is one of the largest machine learning conferences in the world, held since 1987. It was originally planned to take place in Vancouver, Canada, on December 6-12, but was held online instead. NeurIPS is as prestigious as ever: 9,454 papers were submitted and 1,300 accepted. Among the accepted papers are three by researchers of the Faculty of Computer Science.
The Faculty Presents Research Results at NeurIPS, the Largest International Machine Learning Conference
Researchers of the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montreal, Canada.
Faculty of Computer Science Staff Attend International Conference on Machine Learning
On August 6-11, 2017, the 34th International Conference on Machine Learning (ICML) was held in Sydney, Australia. The conference is ranked A* by CORE and is one of the two leading conferences in machine learning. It has been held since 1980, and this year more than 1,000 participants from various countries took part.
'Machine Learning Algorithm Able to Find Data Patterns a Human Could Not'
In December 2016, five new international laboratories opened at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. The lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern machine learning: the neural network paradigm and the Bayesian paradigm.
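As a rough illustration of the idea behind combining the two paradigms, the sketch below treats the weights of a single linear layer as random variables rather than fixed numbers: predictions are averaged over weight samples, which also yields a per-output uncertainty estimate. All function names, shapes, and numbers here are hypothetical and are not taken from the lab's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesian_linear_predict(x, w_mean, w_std, n_samples=100):
    """Monte Carlo estimate of the predictive mean and std of x @ w,
    where the weights w follow a factorized Gaussian with the given
    mean and standard deviation (a stand-in for a learned posterior)."""
    draws = []
    for _ in range(n_samples):
        w = rng.normal(w_mean, w_std)  # sample one weight configuration
        draws.append(x @ w)            # predict with the sampled weights
    draws = np.stack(draws)
    # Averaging over samples gives the prediction; the spread across
    # samples reflects how uncertain the model is about its weights.
    return draws.mean(axis=0), draws.std(axis=0)

x = np.array([[1.0, 2.0]])
pred_mean, pred_std = bayesian_linear_predict(
    x, w_mean=np.array([0.5, -0.3]), w_std=np.array([0.1, 0.1])
)
# pred_mean is close to x @ w_mean; pred_std quantifies weight uncertainty
```

A deterministic network would return only `x @ w_mean`; the Bayesian treatment adds the uncertainty estimate essentially for free, at the cost of extra forward passes.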