Variance Reduction Methods in Stochastic Computational Graphs

Student: Alanov Aybek

Supervisor: Dmitry Vetrov

Faculty: Faculty of Computer Science

Educational Programme: Applied Mathematics and Information Science (Bachelor)

Year of Graduation: 2017

Recent advances in deep variational inference have led researchers to scale the training process to large models. A breakthrough in gradient estimation for stochastic neural networks with continuous latent variables made it possible to train such models on large datasets. Researchers' attention has now shifted to the problem of estimating gradients in models with discrete latent variables. This thesis considers state-of-the-art variance reduction methods for stochastic neural networks with discrete latent variables and provides a comprehensive comparison of these methods based on their performance across a range of real-world datasets.
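The core difficulty the abstract refers to can be illustrated with the score-function (REINFORCE) estimator, the standard unbiased gradient estimator for discrete latent variables, and the simplest variance reduction technique: subtracting a constant baseline. The toy objective `f`, the Bernoulli parameter `theta`, and the baseline value below are illustrative assumptions, not taken from the thesis; this is a minimal sketch of the general idea, not any specific method compared in the work.

```python
import numpy as np

# Goal: estimate d/dtheta E_{b ~ Bernoulli(theta)}[f(b)] for a discrete b,
# where the reparameterization trick does not apply directly.
rng = np.random.default_rng(0)
theta = 0.3
f = lambda b: (b - 0.45) ** 2          # toy objective (assumed for illustration)

def grad_samples(n, baseline=0.0):
    """Score-function estimator: (f(b) - baseline) * d/dtheta log p(b; theta).

    Subtracting a constant baseline keeps the estimator unbiased because
    E[d/dtheta log p(b; theta)] = 0, but it can greatly reduce variance.
    """
    b = rng.binomial(1, theta, size=n).astype(float)
    score = b / theta - (1.0 - b) / (1.0 - theta)   # Bernoulli score function
    return (f(b) - baseline) * score

true_grad = f(1.0) - f(0.0)                         # analytic: 0.1
naive = grad_samples(100_000)
# Near-optimal constant baseline: E[f(b)] = 0.3*f(1) + 0.7*f(0) = 0.2325
controlled = grad_samples(100_000, baseline=0.2325)

print("true gradient      :", true_grad)
print("naive estimate     :", naive.mean(), " variance:", naive.var())
print("baselined estimate :", controlled.mean(), " variance:", controlled.var())
```

Both estimators converge to the same gradient, but the baselined one has far lower variance; the methods compared in the thesis build more sophisticated control variates on this same principle.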
