Posts by Collection

mlnotes

K-Means Clustering

4 minute read

The K-Means problem is an unsupervised learning problem: the task is to group points into clusters such that points in the same cluster are similar (under some measure of similarity) and points from different clusters are dissimilar.
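
A minimal NumPy sketch of Lloyd's algorithm, the standard heuristic for this problem (the function name and defaults are illustrative, not taken from the post):

```python
import numpy as np

def kmeans(X, K, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and mean updates."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=K, replace=False)]  # init from the data
    for _ in range(n_iters):
        # Assign every point to its closest centroid (Euclidean distance).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points;
        # keep the old centroid if a cluster goes empty.
        centroids = np.stack([
            X[labels == k].mean(axis=0) if np.any(labels == k) else centroids[k]
            for k in range(K)
        ])
    return labels, centroids
```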

Gaussian Mixtures

11 minute read

A Gaussian Mixture is a probabilistic latent variable model that assumes the observed data was generated by sampling from $K$ $d$-dimensional Gaussian distributions.
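
That generative process can be sketched in a few lines of NumPy (a hypothetical helper, assuming full covariance matrices):

```python
import numpy as np

def sample_gmm(n, weights, means, covs, seed=0):
    """Generative process of a Gaussian Mixture: pick a latent component
    z ~ Categorical(weights), then draw x | z ~ N(means[z], covs[z])."""
    rng = np.random.default_rng(seed)
    z = rng.choice(len(weights), size=n, p=weights)
    X = np.stack([rng.multivariate_normal(means[k], covs[k]) for k in z])
    return X, z
```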

projects

Attention Mechanisms for Explanations

Many local post-hoc explainability techniques, such as DeConvNet, Guided Backprop, Layer-wise Relevance Propagation, and Integrated Gradients, rely on “gradient-like” computations, where explanations are propagated backwards through neural networks, one layer at a time. One can alter this backward computation to include attention, which guides these explanation techniques to produce better explanations.
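
As a toy illustration of altering a backward rule: below is Guided Backprop's well-known modified ReLU rule, next to an assumed attention rule that redistributes the explanation signal along attention weights. The attention rule is a simplified stand-in for the idea, not the project's actual method.

```python
import numpy as np

def backward_relu_guided(grad_out, x):
    """Guided Backprop's altered ReLU backward rule: pass the signal only
    where both the forward input and the backward signal are positive."""
    return grad_out * (x > 0) * (grad_out > 0)

def backward_attention(grad_out, A):
    """Assumed rule for this sketch: redistribute the explanation signal
    along the attention matrix A (rows sum to 1), so inputs that received
    more attention receive more relevance."""
    return A.T @ grad_out
```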

publications

Faster Orthogonal Parameterization with Householder Matrices

Published in ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models, 2020

The paper presents a fast parallel algorithm for multiplying an orthogonal matrix, parameterized by a sequence of Householder matrices, with an input vector.

Download here
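
For reference, the sequential baseline the paper improves on might look like this sketch (the paper's parallel algorithm itself is not reproduced here):

```python
import numpy as np

def householder_chain_apply(V, x):
    """Apply H_1 H_2 ... H_k to x, where H_i = I - 2 v_i v_i^T / ||v_i||^2.
    Each reflection costs O(d), but the k steps form a sequential chain;
    the paper's contribution is a parallel algorithm that breaks this chain."""
    for v in reversed(V):  # H_k acts on x first, H_1 last
        x = x - 2.0 * v * (v @ x) / (v @ v)
    return x
```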

One Reflection Suffice

Published in arXiv, 2020

We prove that compositions of many Householder reflections can be replaced with one “auxiliary reflection.” This replacement yields higher GPU utilization and thus speeds up training and inference.

Download here
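
A single reflection applied to a batch is one cheap, highly parallel operation; a minimal sketch:

```python
import numpy as np

def reflect(v, X):
    """Apply one Householder reflection H = I - 2 v v^T / ||v||^2 to a batch
    X of row vectors: a single matrix-free pass, versus a sequential chain
    of k reflections."""
    v = v / np.linalg.norm(v)
    return X - 2.0 * np.outer(X @ v, v)
```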

What if Neural Networks had SVDs?

Published in NeurIPS 2020 (Spotlight), 2020

The paper presents a fast parallel algorithm, suited for GPUs, for a reparameterization of the SVD via a Householder decomposition. We demonstrate high speed-ups in practice.

Download here
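
A sketch of the general shape of such a reparameterization (an illustrative construction, not the paper's algorithm): building the orthogonal factors from Householder vectors makes the singular values explicit.

```python
import numpy as np

def orthogonal_from_householders(V):
    """Accumulate a product of Householder reflections into an orthogonal matrix."""
    Q = np.eye(V.shape[1])
    for v in V:
        Q -= 2.0 * np.outer(v, v @ Q) / (v @ v)
    return Q

def svd_parameterized(Vu, s, Vv):
    """W = U diag(s) V^T with U and V built from Householder vectors, so the
    singular values s of W are explicit and can be constrained or trained."""
    U = orthogonal_from_householders(Vu)
    V = orthogonal_from_householders(Vv)
    return U @ np.diag(s) @ V.T
```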

Backpropagating through Fréchet Inception Distance

Published in arXiv, 2021

The paper presents a fast algorithm, FastFID, for computing the Fréchet Inception Distance (FID) on a small batch. The algorithm makes it possible to monitor the FID of GANs during training and even to train GANs with FID as a loss.

Download here
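
For context, here is the standard FID between two Gaussians fitted to Inception features (this is the plain definition, not the paper's FastFID algorithm):

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(mu1, cov1, mu2, cov2):
    """FID between Gaussians (mu1, cov1) and (mu2, cov2):
    ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 (cov1 cov2)^{1/2})."""
    covmean = sqrtm(cov1 @ cov2).real  # sqrtm may return tiny imaginary parts
    return float(np.sum((mu1 - mu2) ** 2) + np.trace(cov1 + cov2 - 2.0 * covmean))
```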

talks

U-Days Presentation About Machine Learning

I gave a talk about Machine Learning at Aarhus University, spending 45 minutes convincing high-school students that what they learn in math is cool. The talk (with these slides) covered the basic ideas of loss functions and gradient descent on a simple data set generated from a 4th-order polynomial, and then motivated how such tricks can be used to train deep neural networks.
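
A minimal reconstruction of that running example (the polynomial, noise level, and step size are assumed, not taken from the slides):

```python
import numpy as np

# Noisy samples from a 4th-order polynomial, fit by gradient descent
# on the mean squared error.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 1 - 2 * x + 3 * x**2 - x**3 + 0.5 * x**4 + rng.normal(0, 0.05, size=200)

X = np.stack([x**p for p in range(5)], axis=1)  # features 1, x, x^2, x^3, x^4
w = np.zeros(5)                                 # coefficients to learn
for _ in range(5000):
    grad = 2 * X.T @ (X @ w - y) / len(y)       # gradient of the squared loss
    w -= 0.1 * grad                             # one gradient descent step
```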

teaching

Databases

Bachelor Course, AU, Dept. of Computer Science, 2016

The Databases course covers basic SQL, relational algebra, relational calculus, normal forms and concurrency control.

Introduction to Databases

Bachelor Course, AU, Dept. of Computer Science, 2017

Awarded TA of the year at Computer Science (AU) for teaching this course.

Machine Learning

Bachelor Course, AU, Dept. of Computer Science, 2018

The Machine Learning course at Aarhus University covers basic learning theory and supervised and unsupervised learning methods. For more info see here.

Data Mining

Master Course, AU, Dept. of Computer Science, 2020

The Data Mining course at Aarhus University was a brand-new course covering topics such as (subspace) clustering, outlier detection, spectral graph theory, graph mining, and pattern mining.