Publications

Backpropagating through Fréchet Inception Distance

Published in arXiv, 2021

The paper presents a fast algorithm, FastFID, for computing the Fréchet Inception Distance (FID) on a small batch. The algorithm allows monitoring the FID of GANs during training and even training GANs with FID as a loss.
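FID is the Fréchet distance between two Gaussians fitted to Inception features of real and generated images. Below is a minimal NumPy/SciPy sketch of the closed-form distance itself (not the paper's FastFID algorithm); the function name `fid` is my own for illustration:

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(mu1, sigma1, mu2, sigma2):
    # Fréchet distance between N(mu1, sigma1) and N(mu2, sigma2):
    # ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 (sigma1 sigma2)^{1/2})
    covmean = sqrtm(sigma1 @ sigma2)
    covmean = covmean.real  # drop tiny imaginary parts from numerical error
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

With equal covariances the trace term vanishes and the distance reduces to the squared distance between the means.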

Download here

What if Neural Networks had SVDs?

Published in NeurIPS 2020 (Spotlight), 2020

The paper presents a fast, GPU-friendly parallel algorithm for a reparameterization of the SVD in which the orthogonal factors are expressed as Householder decompositions. We demonstrate large speed-ups in practice.
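The parameterization factors a weight matrix as W = U diag(σ) Vᵀ, with U and V each built from a product of Householder reflections, so W has an SVD by construction. A naive sequential NumPy sketch of this parameterization (not the paper's parallel algorithm; function names are illustrative):

```python
import numpy as np

def householder(v):
    # H = I - 2 v v^T / (v^T v): an orthogonal, symmetric reflection.
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

def svd_parameterized(us, vs, sigma):
    # W = U diag(sigma) V^T, where U and V are products of
    # Householder reflections, so the SVD of W is known by construction.
    d = len(sigma)
    U = np.eye(d)
    for u in us:
        U = U @ householder(u)
    V = np.eye(d)
    for v in vs:
        V = V @ householder(v)
    return U @ np.diag(sigma) @ V.T, U, V
```

Because each factor is exactly orthogonal, the singular values of the resulting W are exactly the entries of σ, which is the property the reparameterization exploits.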

Download here

One Reflection Suffice

Published in arXiv, 2020

We prove that compositions of many Householder reflections can be replaced with one “auxiliary reflection.” This replacement yields higher GPU utilization and thus speeds up training and inference.

Download here

Faster Orthogonal Parameterization with Householder Matrices

Published in ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models, 2020

The paper presents a fast parallel algorithm for multiplying an orthogonal matrix, parameterized by a sequence of Householder matrices, with an input vector.
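The sequential baseline that such an algorithm parallelizes applies the k reflections one after another, each in O(d) time, without ever forming a d×d matrix. A minimal NumPy sketch (function name is illustrative):

```python
import numpy as np

def householder_product_vec(vs, x):
    # Compute (H_1 H_2 ... H_k) x for reflections H_i = I - 2 v_i v_i^T,
    # applying them right-to-left; each step costs O(d) instead of O(d^2).
    for v in reversed(vs):
        v = v / np.linalg.norm(v)
        x = x - 2.0 * v * (v @ x)
    return x
```

Each step only involves a dot product and a vector update, which is exactly the sequential dependence the paper's parallel algorithm targets.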

Download here