Posts ✏️
Although I do enjoy digging my feet into newer tricks and techniques when competing, I really like problems that test your understanding of an algorithm or data structure at a deeper level.
In the previous post I explained the basic idea behind using maximum likelihood estimation to fit parametric probability distributions over data. While this works for cases where
One of the fundamental problems machine learning seeks to solve is predicting outcomes under some degree of uncertainty. But before we get to the heart of the probability theory that works behind the scenes, let’s start small. Even with little to no familiarity with how machine learning works, most of us know the most basic form of prediction model: the linear curve fit.
Following up on the PixelRNN and PixelCNN architectures, we look at a paper that uses the distrib
Although GANs are great at producing realistic images, the distributions they learn are implicit. For many applications, having a model that can learn the distribution of natural images
When you’re too lazy to get a neural net to learn from data, you get it to learn how to do that too.
The robot civil rights movement has gathered steam since Francois Chollet blessed us with Keras.
One of the hardest challenges in making dank memes is cropping subjects out of the background cleanly. Wouldn’t it be swell if we could get AI to do the heavy lifting for us?
According to Geoffrey Hinton, Generative Adversarial Networks are the most remarkable idea
After playing around with some search and shortest-path algorithms in my previous post, it’s time to put them into practice with a fun programming problem.