- Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Samsung-HSE Laboratory
- Visiting Scholar: Faculty of Computer Science / Big Data and Information Retrieval School
- Nadezhda Chirkova has been at HSE University since 2016.
Bachelor's and Master's in Applied Mathematics and Information Science
Lomonosov Moscow State University
Awards and Accomplishments
Young Faculty Support Program (Group of Young Academic Professionals)
Category "New Researchers" (2018)
- Bayesian Methods for Data Analysis (Master's programme; Faculty of Computer Science; year 2, modules 1-2; in Russian)
- Introduction to Data Analysis (Minor; Faculty of Computer Science; modules 3-4; in Russian)
- Machine Learning (Bachelor's programme; Faculty of Economic Sciences; year 4, modules 1-2; in Russian)
- Machine Learning (Bachelor's programme; Faculty of Economic Sciences; year 3, modules 1-2; in Russian)
- Chapter: Lobacheva E., Chirkova N., Markovich A., Vetrov D. Structured Sparsification of Gated Recurrent Neural Networks, in: Workshop on Context and Compositionality in Biological and Artificial Neural Systems, Thirty-third Conference on Neural Information Processing Systems. Vancouver, 2019. P. 1-4.
- Chapter: Chirkova N., Lobacheva E., Vetrov D. Bayesian Compression for Natural Language Processing, in: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2018. P. 2910-2915.
- Chapter: Lobacheva E., Chirkova N., Vetrov D. Bayesian Sparsification of Gated Recurrent Neural Networks, in: Workshop on Compact Deep Neural Network Representation with Industrial Applications, Thirty-second Conference on Neural Information Processing Systems. Montréal, 2018. P. 1-6.
- Chapter: Lobacheva E., Chirkova N., Vetrov D. Bayesian Sparsification of Recurrent Neural Networks, in: 1st Workshop on Learning to Generate Natural Language, International Conference on Machine Learning. 2017. P. 1-8.
- Article: Chirkova N. A., Vorontsov K. V. Additive Regularization for Hierarchical Multimodal Topic Modeling // Journal of Machine Learning and Data Analysis. 2016. Vol. 2. No. 2. P. 187-200.
Junior Machine Learning Researcher, Antiplagiat JSC, July-August 2016. Developed a prototype of a domain-specific search system that incorporates a hierarchical topic structure learned from the domain data.
Machine Learning Teaching Assistant, Coursera machine learning specialization, January 2016-March 2017. Developed practical assignments that show students how machine learning algorithms work.
The faculty presented the results of their research at NeurIPS, the largest international machine learning conference
Researchers from the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montréal, Canada.
On August 6-11 the 34th International Conference on Machine Learning (ICML) was held in Sydney, Australia. The conference is ranked A* by CORE and is one of the two leading conferences in the field of machine learning. It has been held annually since 2000, and this year more than 1,000 participants from different countries took part.
In December 2016, five new international laboratories opened up at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. This lab focuses on combined neural Bayesian models that bring together two of the most successful paradigms in modern-day machine learning – the neural network paradigm and the Bayesian paradigm.
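To make the combination of the two paradigms concrete, here is a minimal sketch (not the laboratory's actual code; all names and numbers are illustrative) of the core idea: instead of a single point estimate, each network weight carries a posterior distribution, and predictions are averaged over weight samples, which also yields an uncertainty estimate.

```python
# Illustrative sketch of a Bayesian neural network layer: weights have a
# Gaussian approximate posterior, and the predictive distribution is
# estimated by Monte Carlo averaging over sampled weight configurations.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-hidden-layer regression net: 2 inputs, 3 hidden units.
d_in, d_hidden = 2, 3

# Approximate posterior: an independent Gaussian (mean, std) per weight.
w1_mean = rng.normal(size=(d_in, d_hidden))
w1_std = 0.1 * np.ones((d_in, d_hidden))
w2_mean = rng.normal(size=(d_hidden, 1))
w2_std = 0.1 * np.ones((d_hidden, 1))

def predict(x, n_samples=100):
    """Sample weights from the posterior, average the network outputs."""
    outs = []
    for _ in range(n_samples):
        w1 = w1_mean + w1_std * rng.normal(size=w1_mean.shape)
        w2 = w2_mean + w2_std * rng.normal(size=w2_mean.shape)
        h = np.tanh(x @ w1)
        outs.append(h @ w2)
    outs = np.stack(outs)                       # (n_samples, batch, 1)
    # Predictive mean and per-input uncertainty over weight samples.
    return outs.mean(axis=0), outs.std(axis=0)

x = rng.normal(size=(4, d_in))                  # a batch of 4 inputs
mean, std = predict(x)
print(mean.shape, std.shape)                    # (4, 1) (4, 1)
```

In sparsification work of the kind listed above, such per-weight posteriors are additionally used to prune weights whose posterior concentrates near zero; this sketch omits training and pruning and only shows the probabilistic treatment of weights.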