
Applied Machine Learning

Academic Year
Instruction in English
ECTS credits
Delivered at: Department of Innovation and Business in Information Technologies
Course type: Elective course
Year 1, modules 3 and 4


Lisitsyn, Sergey

Course Syllabus


Machine learning is the field of study that helps us find dependencies in data automatically. This technology makes it possible to solve a variety of problems without explicitly programming rules. Thanks to advances in computing and in the field itself, over the last decade machine learning has become an essential part of products ranging from web services to banking. In this course the student first surveys the essential concepts of machine learning and then practices applying machine learning methods to business tasks. The course emphasizes the practical side and considers various aspects of solving real-world problems. Its content covers the most popular methods, such as linear models, gradient boosting, and neural networks. Finally, the course reviews the best practices of major companies that leverage machine learning.
Learning Objectives

  • Learn to recognize when a business problem can be framed as a machine learning problem
  • Practice fitting models to solve essential machine learning problems such as regression and classification
  • Learn to design and develop machine learning systems
  • Learn to re-use pre-trained models to lower the development cost of machine learning systems
Expected Learning Outcomes

  • Can identify a problem suitable for machine learning
  • Knows at least a few modern applications of machine learning
  • Able to identify classification, regression, and clustering problems
  • Knows the limitations of linear models
  • Able to fit a logistic regression model on a given dataset
  • Able to fit and interpret a decision tree model on a given dataset
  • Knows the essential rules to develop and support machine learning systems
  • Able to use pre-trained models
  • Understands the idea of convolution as the base operation for images and audio data
  • Able to train a neural network given a dataset
  • Understands the concept of differentiable programming
  • Able to identify the suitable metric for a machine learning system
  • Can fit a clustering model given a dataset
  • Able to identify a clustering problem
  • Understands the concept of non-parametric learning
  • Understands the concept of embeddings
  • Understands the essential methods for recommenders: collaborative filtering, content-based, and matrix factorization
  • Can identify a recommender problem
  • Understands the universality of the gradient boosting approach
  • Understands the boosting approach to creating an ensemble of models
  • Knows the relationship between model complexity and overfitting
  • Able to identify overfitting
  • Able to apply the gradient boosting approach to solve classification and regression problems
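One of the outcomes above, fitting a logistic regression model, can be sketched in a few lines of plain Python. The dataset, learning rate, and iteration count below are made up for illustration and are not part of the course materials:

```python
import math

# Tiny, made-up 1-D dataset: inputs with binary labels.
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit weight w and bias b by gradient descent on the log-loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    grad_w = grad_b = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)   # predicted probability of class 1
        grad_w += (p - y) * x    # d(log-loss)/dw for this example
        grad_b += (p - y)        # d(log-loss)/db for this example
    w -= lr * grad_w / len(xs)
    b -= lr * grad_b / len(xs)

# The decision boundary should land between the two groups (x around 2.25).
print(sigmoid(w * 1.0 + b), sigmoid(w * 3.5 + b))
```

The same update rule generalizes to many features; in practice a library implementation would be used rather than hand-written gradient descent.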
Course Contents

  • Scope of machine learning
    Reasons to use machine learning and the limitations of current technologies. An overview of applications of machine learning.
  • Machine learning problems
    Classification, regression, and clustering.
  • Linear models for regression and classification
    The formal definition. The logistic regression model. Loss functions and regularizers.
  • Decision trees and ensembles
    Decision tree models. Entropy and Gini as measures of information for a split. Learning algorithms: ID3 and CART.
  • Overfitting
    The overfitting effect and its causes. Ways to mitigate overfitting. Ensembles of various models.
  • Boosting and gradient boosting
    Boosting as an approach to increasing the complexity of models. Overfitting and boosting. Relations between boosting and gradient-based methods. The gradient boosting approach.
  • Recommender systems and embeddings
    Collaborative filtering and content-based recommenders. The matrix factorization approach. Relations between matrix factorization and embeddings. Embeddings as the modern way to solve recommendation problems.
  • Non-parametric methods for classification and regression
    The k-nearest neighbors algorithm. The kernel variant of the Support Vector Machine (SVM) method. Gaussian processes.
  • Clustering
    Clustering as an ill-posed problem. The k-means algorithm. Other approaches such as hierarchical clustering and DBSCAN.
  • Metrics of machine learning
    The role of metrics and losses in machine learning. Regression metrics such as MSE, RMSE, and MAE. Classification metrics such as precision, recall, and accuracy. Estimating the quality of clustering.
  • Neural networks
    The model of a neuron. Neural networks as complex differentiable functions. Gradient descent and the chain rule as the backpropagation algorithm. Differentiable programming.
  • Convolutional neural networks
    Convolution as an image-processing operation. The role of convolutions in feature detection. The convolutional neural layer. Modern architectures of convolutional neural networks.
  • Machine learning in production systems
    Essential rules of machine learning in production systems. The best practices from companies using machine learning.
Assessment Elements

  • Homework (partially blocks the final grade calculation)
    A student should submit a Jupyter notebook.
  • Homework (partially blocks the final grade calculation)
    A student should either submit a Jupyter notebook to the professor or participate in an in-class Kaggle competition.
  • Written exam (blocking)
    Format: the exam is a written programming assignment, taken on the MS Teams platform. Students must submit their assignment during the exam session and no later. To join the exam in MS Teams, students' hardware must meet the requirements listed at https://docs.microsoft.com/ru-ru/microsoftteams/hardware-requirements-for-the-teams-app. A student is expected to check hardware compliance no later than 7 days before the exam, sign in with their corporate account, and verify the ability to join the meeting. If any of these requirements cannot be met, the student must inform the professor and the programme manager 2 weeks before the exam date. Students may not involve any other person in their programming assignment; any interaction with other students that gives an advantage on the assignment is prohibited, as is any plagiarism in the programming assignment. Students are allowed to use any Internet resources and to clarify the assignment with the professor.
  • MOOC (partially blocks the final grade calculation)
    A student is graded in a binary way based on completion of the MOOC.
Interim Assessment

  • Interim assessment (module 4)
    0.3 * Homework 1 + 0.3 * Homework 2 + 0.2 * MOOC + 0.2 * Written exam
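A worked example of the formula with hypothetical component grades (the numbers below are made up; note that the written exam is a blocking element and must be passed regardless of the weighted sum):

```python
# Hypothetical component grades on a 10-point scale.
homework_1 = 8
homework_2 = 7
mooc = 10          # binary grading: completing the MOOC gives full credit
written_exam = 6   # blocking element: must be passed on its own

final = (0.3 * homework_1 + 0.3 * homework_2
         + 0.2 * mooc + 0.2 * written_exam)
print(round(final, 1))  # 7.7
```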


Recommended Core Bibliography

  • D. Sculley, Todd Phillips, Dietmar Ebner, Vinay Chaudhary, & Michael Young. (n.d.). Machine Learning: The High-Interest Credit Card of Technical Debt. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsbas&AN=edsbas.BAEF1F2C
  • Hastie, T., Tibshirani, R., & Friedman, J. H. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction (2nd ed., corrected 7th printing). New York: Springer. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=277008
  • Segaran, T. (2007). Programming Collective Intelligence: Building Smart Web 2.0 Applications. Beijing: O’Reilly Media. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=415280

Recommended Additional Bibliography

  • Caselles-Dupré, H., Lesaint, F., & Royo-Letelier, J. (2018). Word2Vec applied to Recommendation: Hyperparameters Matter. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsarx&AN=edsarx.1804.04212