
Introduction to Deep Learning

Academic Year: 2020/2021
Instruction in English (ENG)
ECTS credits: 5
Course type: Compulsory course
When: 3rd year, modules 1-3

Instructors


Gadetsky, Artyom
Kovalev, Alexey
Khaydurov, Ruslan
Tsvigun, Akim

Course Syllabus

Abstract

The goal of this course is to give learners a basic understanding of modern neural networks and their applications in computer vision and natural language understanding. The course starts with a recap of linear models and a discussion of the stochastic optimization methods that are crucial for training deep neural networks. Learners will study all popular building blocks of neural networks, including fully connected, convolutional, and recurrent layers, and will use these building blocks to define complex modern architectures in the TensorFlow and Keras frameworks. In the course project, learners will implement a deep neural network for image captioning, the task of producing a text description for an input image. The course is based on the MOOC “Introduction to deep learning”: https://ru.coursera.org/learn/intro-to-deep-learning.
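
The abstract above mentions defining architectures from standard building blocks in TensorFlow and Keras. Purely as an illustrative sketch, and not course material, the snippet below stacks convolutional and fully connected layers into a small image classifier and trains it with mini-batch stochastic gradient descent; the dataset, layer sizes, and hyperparameters are placeholder assumptions.

    import tensorflow as tf
    from tensorflow.keras import layers

    # Load a standard image dataset and scale pixel values to [0, 1].
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0
    x_test = x_test[..., None] / 255.0

    # Building blocks: convolutional layers for images, fully connected layers on top.
    model = tf.keras.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

    # Stochastic optimization: mini-batch SGD on a cross-entropy loss.
    model.compile(
        optimizer=tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    model.fit(x_train, y_train, batch_size=64, epochs=3,
              validation_data=(x_test, y_test))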
Learning Objectives

  • To familiarize students with the basic concepts, models and algorithms of neural networks
Expected Learning Outcomes

  • Know principles of neural network models
  • Have skills in training and applying basic neural network models
Course Contents

  • Introduction to optimization
    In the first week you'll learn about linear models and stochastic optimization methods. Linear models are basic building blocks for many deep architectures, and stochastic optimization is used to learn every model that we'll discuss in our course.
  • Introduction to neural networks
    This module is an introduction to the concept of a deep neural network. You'll begin with the linear model and finish with writing your very first deep network.
  • Deep Learning for images
    This week you will learn about the building blocks of deep learning for image input. You will learn how to build Convolutional Neural Network (CNN) architectures from these blocks and how to quickly solve a new task using so-called pre-trained models.
  • Unsupervised representation learning
    This week we will dive into the unsupervised side of deep learning. You'll learn how to generate, morph, and search images with deep learning.
  • Deep learning for sequences
    This week you will learn how to use deep learning for sequences such as text, video, and audio. You will learn about several Recurrent Neural Network (RNN) architectures and how to apply them to tasks with sequential input or output.
  • First programming project
    This week you will apply everything you have learned about neural networks for images and text to the project. You will solve the task of generating descriptions for real-world images! A rough sketch of such a model follows this list.
  • Second programming project
    Final project
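
As referenced in the first programming project above, image captioning combines a convolutional encoder with a recurrent decoder. The sketch below is only a rough illustration of that encoder-decoder wiring in Keras, not the course's starter code; VOCAB_SIZE, MAX_LEN, the layer sizes, and the choice of MobileNetV2 as a pre-trained backbone are placeholder assumptions.

    import tensorflow as tf
    from tensorflow.keras import layers

    VOCAB_SIZE = 5000   # hypothetical vocabulary size
    MAX_LEN = 20        # hypothetical maximum caption length
    EMBED_DIM = 256     # hypothetical embedding / state size

    # Encoder: a pre-trained CNN (downloads ImageNet weights) maps an image
    # to a single feature vector.
    cnn = tf.keras.applications.MobileNetV2(
        include_top=False, pooling="avg",
        input_shape=(224, 224, 3), weights="imagenet")
    cnn.trainable = False

    image_in = tf.keras.Input(shape=(224, 224, 3))
    image_features = layers.Dense(EMBED_DIM, activation="relu")(cnn(image_in))

    # Decoder: an LSTM predicts the next word of the caption, with the image
    # features used as its initial hidden and cell state.
    caption_in = tf.keras.Input(shape=(MAX_LEN,), dtype="int32")
    embedded = layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True)(caption_in)
    rnn_out = layers.LSTM(EMBED_DIM, return_sequences=True)(
        embedded, initial_state=[image_features, image_features])
    next_word = layers.Dense(VOCAB_SIZE, activation="softmax")(rnn_out)

    model = tf.keras.Model(inputs=[image_in, caption_in], outputs=next_word)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.summary()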
Assessment Elements

  • non-blocking Online course
    There is no exam. The course grade is given according to the cumulative assessment.
  • non-blocking Tests
    There is no exam. The course grade is given according to the cumulative assessment.
  • non-blocking Homeworks
    There is no exam. The course grade is given according to the cumulative assessment.
Interim Assessment

  • Interim assessment (3 module)
    0.4 * Homeworks + 0.3 * Online course + 0.3 * Tests
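
As a small worked example of the cumulative formula above (the component grades below are hypothetical, assuming each element is marked on a common scale):

    # Hypothetical component grades; weights taken from the formula above.
    homeworks, online_course, tests = 8, 7, 9
    interim = 0.4 * homeworks + 0.3 * online_course + 0.3 * tests
    print(interim)  # 0.4*8 + 0.3*7 + 0.3*9 = 8.0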
Bibliography

Recommended Core Bibliography

  • Goodfellow I., Bengio Y., Courville A. Deep Learning (Russian edition: Глубокое обучение). DMK Press, 2018. 652 pp. ISBN: 978-5-97060-618-6. Electronic text // EBS Lan. URL: https://e.lanbook.com/book/107901

Recommended Additional Bibliography

  • Christopher M. Bishop. Pattern Recognition and Machine Learning. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsbas&AN=edsbas.EBA0C705