Dmitry Kropotov
- Senior Lecturer: Faculty of Computer Science / Big Data and Information Retrieval School
- Dmitry Kropotov has been at HSE University since 2017.
Courses (2022/2023)
- Optimization in Machine Learning (Bachelor's programme; Faculty of Computer Science; year 3, modules 3-4; taught in Russian)
Past Courses
Courses (2021/2022)
- Optimization in Machine Learning (Bachelor's programme; Faculty of Computer Science; year 3, modules 3-4; taught in Russian)
Courses (2020/2021)
- Optimization in Machine Learning (Bachelor's programme; Faculty of Computer Science; year 3, modules 3-4; taught in Russian)
Publications
- Article Bobrov E., Kropotov D., Lu H., Zaev D. Massive MIMO Adaptive Modulation and Coding Using Online Deep Learning Algorithm // IEEE Communications Letters. 2022. Vol. 26. No. 4. P. 818-822.
- Article Bobrov E., Kropotov D., Troshin S., Zaev D. Study on Precoding Optimization Algorithms in Massive MIMO System with Multi-Antenna Users // Optimization Methods and Software. 2022. P. 1-16.
- Article Rodomanov A., Kropotov D. A Randomized Coordinate Descent Method with Volume Sampling // SIAM Journal on Optimization. 2020. Vol. 30. No. 3. P. 1878-1904.
- Preprint Kodryan M., Kropotov D., Vetrov D. MARS: Masked Automatic Ranks Selection in Tensor Decompositions // First Workshop on Quantum Tensor Networks in Machine Learning (QTNML 2020), 34th Conference on Neural Information Processing Systems (NeurIPS 2020). 2020.
- Chapter Izmailov P., Novikov A., Kropotov D. Scalable Gaussian Processes with Billions of Inducing Inputs via Tensor Train Decomposition // Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS 2018), Proceedings of Machine Learning Research. 2018. P. 726-735.
- Article Izmailov P., Kropotov D. Faster Variational Inducing Input Gaussian Process Classification // Journal of Machine Learning and Data Analysis. 2017. Vol. 3. No. 1. P. 20-35.
- Chapter Rodomanov A., Kropotov D. A Superlinearly-Convergent Proximal Newton-type Method for the Optimization of Finite Sums // Proceedings of the International Conference on Machine Learning (ICML 2016), Proceedings of Machine Learning Research Vol. 48. New York, 2016. P. 2597-2605.
Conferences
- 2018
The 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018) (Playa Blanca, Lanzarote, Canary Islands). Presentation: Scalable Gaussian Processes with Billions of Inducing Inputs via Tensor Train Decomposition
- 2016
The 33rd International Conference on Machine Learning (ICML 2016) (New York). Presentation: A Superlinearly-Convergent Proximal Newton-Type Method for the Optimization of Finite Sums