About me
This is a page not in the main menu
Published:
The first “real” day at NeurIPS has been like a warm-up. There have been three tutorial sessions followed by two invited talks. I have not been busy at all.
Published:
I have been bouncing around the expo today. The expo has multiple tracks.
Published:
I am currently on my way to the NeurIPS’18 conference and I am so excited. I am really looking forward to seeing a lot of presentations of cool work. My expectation is that there will be so much good inspiration on what directions to explore in my own work.
Published:
The K-Means problem is an unsupervised learning problem, where the task is to group points into clusters such that points in the same cluster are similar (for some measure of similarity) and points from different clusters are dissimilar.
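The standard heuristic for the K-Means problem is Lloyd's algorithm, which alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. A minimal NumPy sketch (the function name, random initialization, and iteration count are illustrative assumptions):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """A minimal sketch of Lloyd's algorithm for the K-Means problem."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct data points at random.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid (Euclidean).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels
```

Each iteration can only decrease the K-Means objective, so the procedure converges, though only to a local optimum that depends on the initialization.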
Published:
Gaussian Mixtures is a probabilistic latent variable model that assumes the observed data were generated by sampling from $K$ $d$-dimensional Gaussian distributions.
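The generative story can be written down directly: for each point, first draw a latent component $z$ according to the mixture weights, then draw the observation from that component's Gaussian. A minimal sketch (the function name `sample_gmm` is hypothetical):

```python
import numpy as np

def sample_gmm(n, weights, means, covs, seed=0):
    """Sample n points from a K-component, d-dimensional Gaussian mixture."""
    rng = np.random.default_rng(seed)
    K = len(weights)
    # Latent step: choose a mixture component for each point.
    z = rng.choice(K, size=n, p=weights)
    # Observation step: sample from the chosen component's Gaussian.
    X = np.stack([rng.multivariate_normal(means[k], covs[k]) for k in z])
    return X, z
```

Fitting the model is the reverse problem: given only `X`, infer the weights, means, and covariances, typically with the EM algorithm.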
Published:
Many local post-hoc explainability techniques, such as DeConvNet, Guided Backprop, Layer-wise Relevance Propagation, and Integrated Gradients, rely on “gradient-like” computations, where explanations are propagated backwards through a neural network one layer at a time. One can alter this backward computation to include attention, which guides these techniques to produce better explanations.
Published in ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models, 2020
The paper presents a fast parallel algorithm for multiplying an orthogonal matrix, parameterized by a sequence of Householder matrices, with an input vector.
Download here
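For reference, the sequential baseline applies the reflections one at a time; each Householder matrix $H_i = I - 2 v_i v_i^T / \lVert v_i \rVert^2$ can be applied to a vector in $O(d)$ without ever forming the matrix. This sketch shows only that sequential baseline, not the paper's parallel algorithm:

```python
import numpy as np

def householder_product(vs, x):
    """Apply the product H_1 H_2 ... H_m to x, one reflection at a time.

    Each H_i = I - 2 v_i v_i^T / ||v_i||^2 is applied in O(d) using only
    dot products, so the total cost is O(m * d)."""
    for v in reversed(vs):  # apply H_m first, matching the left-to-right product
        x = x - 2.0 * v * (v @ x) / (v @ v)
    return x
```

Since each $H_i$ is orthogonal, the product preserves vector norms, and since each reflection is its own inverse, applying the reflections in reverse order undoes the product.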
Published in arXiv, 2020
We present ideas on how to improve the highly popular explanation method LIME.
Download here
Published in arXiv, 2020
We prove that compositions of many Householder reflections can be replaced with one “auxiliary reflection.” Such replacement yields higher GPU utilization and thus speeds up training and inference.
Download here
Published in NeurIPS 2020 (Spotlight), 2020
The paper presents a fast parallel algorithm, suited for GPUs, for a reparameterization of the SVD by a Householder decomposition. We demonstrate large speed-ups in practice.
Download here
Published in arXiv, 2021
The paper presents a fast algorithm, FastFID, for computing the Fréchet Inception Distance (FID) on a small batch. The algorithm allows monitoring the FID of GANs during training and even training GANs with FID as a loss.
Download here
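For context, the FID models real and generated image features as Gaussians $\mathcal{N}(\mu_r, \Sigma_r)$ and $\mathcal{N}(\mu_g, \Sigma_g)$ and computes the squared Fréchet distance between them:

```latex
\mathrm{FID}
  = \lVert \mu_r - \mu_g \rVert_2^2
  + \operatorname{Tr}\!\left( \Sigma_r + \Sigma_g - 2\,(\Sigma_r \Sigma_g)^{1/2} \right)
```

Training with FID as a loss requires backpropagating through this expression, including the matrix square root term.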
Published in The British Machine Vision Conference (BMVC) 2021, 2021
We present the first one-pass algorithm for generating semantically meaningful counterfactual image examples.
Download here
Published:
I gave an introduction to Tensorflow at an AUHack workshop on Machine Learning. The slides can be found here.
Published:
In a journal club that I am a part of, I gave a summary of the PatternNet and PatternAttribution article [1]. The presentation was done from a Jupyter Notebook, which can be found here.
Published:
At a handout meeting in our research group, I presented the Neural Ordinary Differential Equations paper [1], which won a best paper award at NeurIPS.
Published:
I gave a talk about Machine Learning at Aarhus University, spending 45 minutes convincing high-school students that what they learn in math is cool. The talk (with these slides) covered the basic ideas of loss functions and gradient descent on a simple data set generated from a 4th-order polynomial, and then motivated how we can use such tricks to train deep neural networks.
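The core of that demo can be reproduced in a few lines: generate noisy samples of a 4th-order polynomial, then minimize the mean squared error over the polynomial coefficients by gradient descent. The specific polynomial, learning rate, and iteration count below are illustrative assumptions, not the ones from the talk:

```python
import numpy as np

# Generate noisy samples of an (assumed) 4th-order polynomial.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 100)
y = 2 * x**4 - x**2 + 0.5 + rng.normal(0, 0.05, size=x.shape)

# Design matrix: columns are 1, x, x^2, x^3, x^4.
A = np.stack([x**p for p in range(5)], axis=1)
w = np.zeros(5)  # coefficients to learn

lr = 0.1
for _ in range(5000):
    residual = A @ w - y                 # prediction error on every sample
    grad = 2 * A.T @ residual / len(x)   # gradient of the mean squared error
    w -= lr * grad                       # gradient descent step
```

The same loop, with the linear model swapped for a neural network and the gradient computed by backpropagation, is exactly how deep networks are trained.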
Published:
I gave a 45 minute overview of what we did in the paper Backpropagating through Fréchet Inception Distance. The slides can be found here.
Bachelor Course, AU, Dept. of Computer Science, 2016
The Databases course covers basic SQL, relational algebra, relational calculus, normal forms and concurrency control.
Bachelor Course, AU, Dept. of Computer Science, 2016
The Pervasive Computing (Operating Systems) course covers topics within pervasive and mobile computing along with topics like threading, scheduling, and resource management.
Bachelor Course, AU, Dept. of Computer Science, 2016
The Software Architecture course covers topics like design patterns and test-driven development.
Bachelor Course, AU, Dept. of Computer Science, 2017
Awarded TA of the year at Computer Science (AU) for teaching this course.
Bachelor Course, AU, Dept. of Computer Science, 2018
The Machine Learning course at Aarhus University covers basic learning theory as well as supervised and unsupervised learning methods. For more info see here.
Bachelor Course, AU, Dept. of Computer Science, 2019
The Introduction to Databases and Implementation and Applications of Databases courses cover basic SQL, relational algebra, relational calculus, normal forms, concurrency control, data warehouses, etc.
Master Course, AU, Dept. of Computer Science, 2020
The Data Mining Course at Aarhus University was a brand new course, which covered topics like (subspace) clustering, outlier detection, spectral graph theory, graph mining and pattern mining.
Master Course, SDC, Neuroscience & Neuroimaging, 2020
The Pattern Recognition and Predictive Modelling in Neuroscience course at the Sino-Danish Center (SDC) in Beijing is an introductory Machine Learning course for neuro-students. It covers concepts like regression, classification, clustering and dimensionality reduction.