
Compression and Acceleration of Deep Neural Networks for Banking Text Data Classification

Student: Umanetc Ekaterina

Supervisor: Alexey Masyutin

Faculty: Faculty of Computer Science

Educational Programme: Financial Technology and Data Analysis (Master)

Year of Graduation: 2020

In this paper, we apply two representative compression and acceleration techniques. The first method is group-LASSO regularization of a BiLSTM. It removes components from the model and simultaneously decreases the sizes of all basic structures of the BiLSTM. The algorithm comes in two variants, depending on whether the forward and reverse directions are treated as one BiLSTM or as two independent LSTMs. The second approach is knowledge distillation from RuBERT, the Russian version of BERT, into a BiLSTM. Experiments show that both methods, as well as their combination, provide significant speedups and compression rates on large models with acceptable performance loss.
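To make the two ideas concrete, below is a minimal, hypothetical PyTorch sketch of (1) a group-LASSO penalty over per-hidden-unit weight groups of a (Bi)LSTM, so that zeroing a whole group lets that unit be pruned and all basic LSTM structures shrink together, and (2) a standard knowledge-distillation loss from a teacher such as RuBERT into a BiLSTM student. The grouping scheme, function names, and hyperparameters are illustrative assumptions, not the thesis implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


def group_lasso_penalty(lstm: nn.LSTM) -> torch.Tensor:
    """Sum of L2 norms over per-hidden-unit weight groups of the first LSTM layer.

    Each hidden unit h forms one group: its rows in the input-to-hidden and
    hidden-to-hidden matrices across all four gates, plus column h of the
    hidden-to-hidden matrix. Driving a group to zero allows that unit to be
    removed. (Simplified grouping, first layer only, for illustration.)
    """
    penalty = lstm.weight_ih_l0.new_zeros(())
    hidden = lstm.hidden_size
    for direction in range(2 if lstm.bidirectional else 1):
        suffix = "_reverse" if direction == 1 else ""
        w_ih = getattr(lstm, f"weight_ih_l0{suffix}")  # shape (4*hidden, input_size)
        w_hh = getattr(lstm, f"weight_hh_l0{suffix}")  # shape (4*hidden, hidden)
        for h in range(hidden):
            rows = torch.arange(h, 4 * hidden, hidden)   # unit h in all four gates
            group = torch.cat([w_ih[rows].reshape(-1),
                               w_hh[rows].reshape(-1),
                               w_hh[:, h]])
            penalty = penalty + group.norm(p=2)
    return penalty


def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target KL term (teacher -> student) plus hard-label cross-entropy."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

In training, the two terms would simply be combined, e.g. loss = distillation_loss(s_logits, t_logits, labels) + lambda_gl * group_lasso_penalty(model.lstm), where lambda_gl is an assumed regularization weight controlling how aggressively hidden units are pushed toward zero.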

