Deep Equilibrium Residual Networks

Student: Sergey Troshin

Supervisor: Ekaterina Lobacheva

Faculty: Faculty of Computer Science

Educational Programme: Applied Mathematics and Information Science (Bachelor)

Final Grade: 10

Year of Graduation: 2020

In this work, we study the problem of memory consumption in very deep neural networks. We examine the applicability of a recently introduced method, based on implicit differentiation through the equilibrium point, to general feedforward architectures. We show how this method can be applied to residual neural networks, analyze the properties of such models, and report their performance on the CIFAR-10 dataset. We find that residual networks with implicit differentiation through the equilibrium point show slightly lower performance than basic residual networks and also require more time to train. Finally, we discuss when the application of this method is genuinely memory efficient.
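The memory saving the abstract refers to comes from treating a weight-tied block f as if it were applied infinitely many times: instead of storing activations for every application, one solves for the fixed point z* = f(z*, x) in the forward pass and recovers gradients through the implicit function theorem, so activation memory is constant in the effective depth. The following is a minimal PyTorch-style sketch of such an equilibrium layer, not the thesis's actual implementation; the class name DEQFixedPoint, the plain fixed-point (Picard) iteration, the example residual block, and the tolerance and iteration settings are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class DEQFixedPoint(nn.Module):
    """Equilibrium layer: returns z* satisfying z* = f(z*, x).

    Forward runs a fixed-point iteration under no_grad (nothing is stored
    per iteration). Backward solves the adjoint equation
    g = (df/dz)^T g + dl/dz* by the same kind of iteration, which realizes
    the implicit-function-theorem gradient without unrolling the solver.
    """

    def __init__(self, f, max_iter=50, tol=1e-4):
        super().__init__()
        self.f = f  # weight-tied block taking (z, x), e.g. a residual block
        self.max_iter = max_iter
        self.tol = tol

    def forward(self, x):
        # Forward pass: plain fixed-point iteration, no autograd tape.
        with torch.no_grad():
            z = torch.zeros_like(x)
            for _ in range(self.max_iter):
                z_next = self.f(z, x)
                if (z_next - z).norm() / (z.norm() + 1e-8) < self.tol:
                    z = z_next
                    break
                z = z_next

        # One differentiable call so z depends on the parameters of f.
        z = self.f(z, x)

        # Backward pass: a hook corrects the incoming gradient by solving
        # g = (df/dz)^T g + grad with vector-Jacobian products.
        z0 = z.detach().requires_grad_()
        f0 = self.f(z0, x)

        def backward_hook(grad):
            g = grad
            for _ in range(self.max_iter):
                g_next = torch.autograd.grad(f0, z0, g, retain_graph=True)[0] + grad
                if (g_next - g).norm() < self.tol:
                    return g_next
                g = g_next
            return g

        if z.requires_grad:
            z.register_hook(backward_hook)
        return z


# Hypothetical usage with a small weight-tied residual-style block.
class ResBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.conv = nn.Conv2d(ch, ch, 3, padding=1)
        self.norm = nn.GroupNorm(4, ch)

    def forward(self, z, x):
        # Input injection: x enters every iteration, as in equilibrium models.
        return self.norm(torch.relu(self.conv(z)) + x)


layer = DEQFixedPoint(ResBlock(16))
x = torch.randn(2, 16, 8, 8)
out = layer(x)        # activation memory does not grow with iteration count
out.sum().backward()  # gradients come via implicit differentiation
```

Note the trade-off the abstract reports: each training step requires an iterative solve in both the forward and the backward pass, so the memory saving is paid for with extra computation time.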

Full text (added May 20, 2020)
