Young Faculty Support Program (Group of Young Academic Professionals)
Category "New Researchers" (2018)
2nd year of study
Approved topic of thesis: Randomization in second-order optimization methods
Academic Supervisor: Vetrov, Dmitry
- Article Rodomanov A., Kropotov D. A Superlinearly-Convergent Proximal Newton-Type Method for the Optimization of Finite Sums // Journal of Machine Learning Research. 2016. Vol. 48. P. 2597-2605.
- Chapter Rodomanov A., Kropotov D. A Superlinearly-Convergent Proximal Newton-type Method for the Optimization of Finite Sums, in: Proceedings of Machine Learning Research. Proceedings of the International Conference on Machine Learning (ICML 2016). Vol. 48. NY, 2016. P. 2597-2605.
- Chapter Dvurechensky P., Gasnikov A., Gasnikova E., Matsievsky S., Rodomanov A., Usik I. Primal-Dual Method for Searching Equilibrium in Hierarchical Congestion Population Games // In: Proceedings of DOOR 2016 Conference, special issue of CEUR Workshop Proceedings. Vol. 1623. CEUR Workshop Proceedings, 2016. P. 584-595.
In December 2016, five new international laboratories opened at the Higher School of Economics, one of which was the International Laboratory of Deep Learning and Bayesian Methods. This lab focuses on neural Bayesian models that bring together two of the most successful paradigms in modern machine learning: the neural network paradigm and the Bayesian paradigm.