
Student: Aleksey Khachiyants
Title: Sequential Learning of Sparse ResNet Blocks
Supervisor:
Faculty:
Educational Programme:
Final Grade: 7
Year of Graduation: 2019
Neural network sparsification is an active field of research with several practical applications, such as model compression for mobile devices. There are two main approaches: weight pruning and Bayesian sparsification. The latter generally produces better results and allows models to be trained from scratch. However, this does not hold for deep models: experiments show that training deep convolutional networks with sparse variational dropout does not converge. This graduation thesis proposes layerwise learning as a workaround and explores its applicability to residual networks. Experiments show that the sequential sparsification approach can be applied to deep residual networks; moreover, models sparsified with this approach achieve high compression ratios without serious accuracy loss.
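The sequential scheme described in the abstract can be sketched as follows. This is a toy illustration, not the thesis's method: the residual blocks are simple two-layer NumPy maps, and magnitude pruning stands in for the sparse variational dropout used in the actual work. All names (`ResBlock`, `sparsify`, `keep_frac`) are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class ResBlock:
    """Toy residual block: y = x + W2 @ relu(W1 @ x)."""
    def __init__(self, dim, hidden):
        self.W1 = rng.normal(scale=0.1, size=(hidden, dim))
        self.W2 = rng.normal(scale=0.1, size=(dim, hidden))

    def forward(self, x):
        return x + self.W2 @ relu(self.W1 @ x)

    def sparsify(self, keep_frac):
        # Magnitude pruning as a stand-in for the Bayesian
        # sparsification (sparse variational dropout) in the thesis.
        for W in (self.W1, self.W2):
            thresh = np.quantile(np.abs(W), 1.0 - keep_frac)
            W[np.abs(W) < thresh] = 0.0

def sparsity(block):
    total = block.W1.size + block.W2.size
    zeros = (block.W1 == 0).sum() + (block.W2 == 0).sum()
    return zeros / total

# Sequential (layerwise) scheme: process one block at a time while
# all previously processed blocks stay frozen.
blocks = [ResBlock(dim=8, hidden=16) for _ in range(3)]
x = rng.normal(size=8)
for block in blocks:
    # Here the thesis would train the current block with sparse
    # variational dropout; this sketch only applies the pruning step.
    block.sparsify(keep_frac=0.2)
    x = block.forward(x)

print([round(sparsity(b), 2) for b in blocks])
```

The key property illustrated is that each block is handled in isolation, so the divergence observed when sparsifying a deep network end-to-end never arises: only one block's weights change at any step.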
