Max Ryabinin
- Teacher, Postgraduate Student: Faculty of Computer Science / Big Data and Information Retrieval School / Joint Department with Yandex
- Junior Research Fellow: Faculty of Computer Science / Big Data and Information Retrieval School / Yandex Laboratory
- Max Ryabinin has been at HSE University since 2017.
Postgraduate Studies
1st year of study
Approved topic of thesis: Methods and Problems of Decentralized Deep Learning
Academic Supervisor: Babenko, Artem
Courses (2021/2022)
- Deep Learning (Bachelor’s programme; Faculty of Computer Science; 4th year, modules 1–2; in Russian)
- Efficient Deep Learning Systems (Bachelor’s programme; Faculty of Computer Science; 4th year, module 3; in Russian)
Publications (6)
- Chapter Diskin M., Bukhtiyarov A., Ryabinin M., Saulnier L., Lhoest Q., Sinitsin A., Popov D., Pyrkin D., Kashirin M., Borzunov A., Villanova del Moral A., Mazur D., Kobelev I., Jernite Y., Wolf T., Pekhimenko G. Distributed Deep Learning In Open Collaborations, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 7879-7897.
- Chapter Tikhonov A., Ryabinin M. It’s All in the Heads: Using Attention Heads as a Baseline for Cross-Lingual Transfer in Commonsense Reasoning, in: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Association for Computational Linguistics, 2021. P. 3534-3546.
- Chapter Ryabinin M., Gorbunov E., Plokhotnyuk V., Pekhimenko G. Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 18195-18211.
- Chapter Ryabinin M., Malinin A., Gales M. Scaling Ensemble Distribution Distillation to Many Classes with Proxy Targets, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 6023-6035.
- Chapter Ryabinin M., Popov S., Prokhorenkova L., Voita E. Embedding Words in Non-Vector Space with Unsupervised Graph Learning, in: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2020. P. 7317-7331.
- Chapter Ryabinin M., Gusev A. Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts, in: Advances in Neural Information Processing Systems 33 (NeurIPS 2020). Curran Associates, Inc., 2020. P. 3659-3672.
Faculty Submits Ten Papers to NeurIPS 2021
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021), one of the world's largest conferences on machine learning and neural networks, takes place on December 6-14, 2021.
NeurIPS 2020 Accepts Three Articles by Faculty of Computer Science Researchers
The 34th Conference on Neural Information Processing Systems (NeurIPS 2020) is one of the largest machine learning conferences in the world, held since 1989. It was scheduled to take place in Vancouver, Canada on December 6-12, but is being held online instead. NeurIPS remains as prestigious as ever: 9,454 articles were submitted and 1,300 were accepted. Among those accepted are three articles by researchers of the Faculty of Computer Science: