Maxim Kodryan
- Junior Research Fellow: Faculty of Computer Science / AI and Digital Science Institute / Centre of Deep Learning and Bayesian Methods
- Maxim Kodryan has been at HSE University since 2019.
Education
- 2020: Master's, Lomonosov Moscow State University
- 2018: Bachelor in Applied Mathematics and Information Science, Lomonosov Moscow State University, Faculty of Computational Mathematics and Cybernetics (CMC)
Courses (2022/2023)
- Optimization in Machine Learning (Bachelor’s programme; Faculty of Computer Science; 3rd year, modules 3-4; taught in Russian)
- Research Seminar "Machine Learning and Applications" (Bachelor’s programme; Faculty of Computer Science; 3rd year, modules 1-4; taught in Russian)
- Past courses: 2021/2022, 2020/2021
Publications (7)
- Chapter Nakhodnov M., Kodryan M., Lobacheva E., Vetrov D. Loss function dynamics and landscape for deep neural networks trained with quadratic loss, in: Doklady Mathematics Vol. 106, Issue 1 (Supplement). Pleiades Publishing, Ltd., 2023. P. 43-62.
- Chapter Kodryan M., Kropotov D., Vetrov D. MARS: Masked Automatic Ranks Selection in Tensor Decompositions, in: Proceedings of The 26th International Conference on Artificial Intelligence and Statistics (AISTATS 2023), Vol. 206. Valencia: PMLR, 2023. P. 3718-3732.
- Chapter Kodryan M., Lobacheva E., Nakhodnov M., Vetrov D. Training Scale-Invariant Neural Networks on the Sphere Can Happen in Three Regimes, in: Thirty-Sixth Conference on Neural Information Processing Systems : NeurIPS 2022. Curran Associates, Inc., 2022. P. 14058-14070.
- Chapter Lobacheva E., Kodryan M., Chirkova N., Malinin A., Vetrov D. On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay, in: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 21545-21556.
- Preprint Kodryan M., Kropotov D., Vetrov D. MARS: Masked Automatic Ranks Selection in Tensor Decompositions / First Workshop on Quantum Tensor Networks in Machine Learning, 34th Conference on Neural Information Processing Systems (NeurIPS 2020). Series QTNML 2020 "First Workshop on Quantum Tensor Networks in Machine Learning, NeurIPS 2020". 2020.
- Chapter Lobacheva E., Chirkova N., Kodryan M., Vetrov D. On Power Laws in Deep Ensembles, in: Advances in Neural Information Processing Systems 33 (NeurIPS 2020). Curran Associates, Inc., 2020. P. 2375-2385.
- Chapter Kodryan M., Grachev A., Ignatov D. I., Vetrov D. Efficient Language Modeling with Automatic Relevance Determination in Recurrent Neural Networks, in: Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), Issue W19-43. Florence, Italy: Association for Computational Linguistics, 2019. P. 40-48.
Employment history
January 2021 — Present Centre of Deep Learning and Bayesian Methods, HSE, Moscow
August 2019 — December 2020 Samsung-HSE Joint Lab, Moscow
June 2017 — March 2019 Samsung R&D Institute Russia, Moscow
Faculty Submits Ten Papers to NeurIPS 2021
The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It took place on December 6-14, 2021.
NeurIPS — 2020 Accepts Three Articles from Faculty of Computer Science’s Researchers
The 34th Conference on Neural Information Processing Systems (NeurIPS 2020) is one of the largest machine learning conferences in the world, held annually since 1989. It was planned to take place in Vancouver, Canada, on December 6-12, but was held online instead. NeurIPS remains as prestigious as ever: 9,454 papers were submitted and 1,300 were accepted. Among the accepted papers are three by researchers of the Faculty of Computer Science:
The paper "On Power Laws in Deep Ensembles" accepted as a spotlight to NeurIPS'20
An oral presentation by the Laboratory's researchers at one of the largest AI conferences.