Deep Painterly Harmonization

Posted on Thu 03 May 2018 in Experiments • Tagged with Deep Learning, Style Transfer

In this article we'll decode the research paper of the same name and get some cool results by integrating random objects into paintings while preserving their style.


Continue reading

Pointer cache for Language Model

Posted on Thu 26 April 2018 in Experiments • Tagged with Deep Learning, NLP

You can easily boost the performance of a language model based on RNNs by adding a pointer cache on top of it. The idea was introduced by Grave et al., whose results showed how this simple technique can decrease your perplexity by 10 points without any additional training. This sounds exciting, so let's see what it's all about and implement it in PyTorch with the fastai library.
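To give a taste of what's coming, here is a minimal sketch of the cache mechanism, assuming we already have the model's softmax output and a window of stored hidden states (the function name and coefficient defaults are mine, and real implementations batch this over time steps):

```python
import torch

def cache_probs(p_vocab, hidden, hist_hiddens, hist_targets, theta=0.3, lam=0.1):
    """Blend an RNN language model's softmax output with a pointer cache.

    p_vocab:      (vocab_size,) model probabilities for the next word
    hidden:       (hidden_size,) current hidden state
    hist_hiddens: (window, hidden_size) hidden states stored in the cache
    hist_targets: (window,) word indices that followed each stored state
    """
    # Similarity between the current hidden state and every cached state.
    sims = torch.softmax(theta * (hist_hiddens @ hidden), dim=0)
    # Each cached position votes for the word that actually followed it.
    p_cache = torch.zeros_like(p_vocab)
    p_cache.index_add_(0, hist_targets, sims)
    # Linear interpolation between the model and the cache distributions.
    return (1 - lam) * p_vocab + lam * p_cache
```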


Continue reading

Recurrent Neural Network

Posted on Sat 14 April 2018 in Basics • Tagged with Deep Learning, NLP

In Natural Language Processing, traditional neural networks struggle with the tasks we give them. To predict the next word in a sentence, for instance, or to grasp its meaning well enough to classify it, you need a structure that keeps some memory of the words it has seen before. That's what Recurrent Neural Networks were designed to do, and we'll look into them in this article.
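To make the idea concrete before diving in, here is a minimal vanilla RNN step in numpy; the dimensions, random weights, and five-word "sentence" are purely illustrative:

```python
import numpy as np

def rnn_step(x, h, Wxh, Whh, bh):
    """One step of a vanilla RNN: the new hidden state mixes the
    current input with the memory carried over from previous words."""
    return np.tanh(x @ Wxh + h @ Whh + bh)

# Illustrative dimensions: 10-dim word vectors, 20-dim hidden state.
rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.1, size=(10, 20))
Whh = rng.normal(0, 0.1, size=(20, 20))
bh = np.zeros(20)

h = np.zeros(20)
for x in rng.normal(size=(5, 10)):    # a "sentence" of 5 word vectors
    h = rnn_step(x, h, Wxh, Whh, bh)  # h now summarizes everything seen so far
```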


Continue reading

The 1cycle policy

Posted on Sat 07 April 2018 in Experiments • Tagged with Deep Learning, SGD, Learning Rate

Properly setting the hyper-parameters of a neural network can be challenging; fortunately, there are some recipes that can help.
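As a preview of the main recipe, here is a sketch of the learning-rate half of the 1cycle schedule (the 45/45/10 phase split and the divisors are illustrative choices, and the full policy also schedules momentum):

```python
def one_cycle_lr(step, total_steps, lr_max, div=10, final_div=100):
    """Learning rate at a given step under a simple 1cycle schedule:
    ramp up from lr_max/div to lr_max over the first 45% of training,
    back down over the next 45%, then anneal to lr_max/final_div."""
    up, down = int(0.45 * total_steps), int(0.9 * total_steps)
    if step < up:                                # warm-up phase
        t = step / up
        return lr_max / div + t * (lr_max - lr_max / div)
    if step < down:                              # cool-down phase
        t = (step - up) / (down - up)
        return lr_max - t * (lr_max - lr_max / div)
    t = (step - down) / (total_steps - down)     # final annealing
    return lr_max / div - t * (lr_max / div - lr_max / final_div)
```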


Continue reading

Convolution in depth

Posted on Thu 05 April 2018 in Basics • Tagged with Deep Learning, Convolution

CNNs (Convolutional Neural Networks) are among the most powerful networks used in computer vision. Let's see what a convolutional layer is all about, from the definition to an implementation in numpy, including backpropagation.
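As a preview, here is the naive forward pass in numpy for a single channel and a single filter; the edge-detection filter and the tiny step image are toy examples:

```python
import numpy as np

def conv2d(x, w, b):
    """Naive valid convolution (really cross-correlation, as in most
    deep learning libraries) of one channel with one filter."""
    H, W = x.shape
    kh, kw = w.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output pixel is a dot product between the filter
            # and the patch of input it currently covers.
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * w) + b
    return out

edge = np.array([[1., 0., -1.]] * 3)     # simple vertical-edge filter
img = np.tile([0., 0., 1., 1.], (4, 1))  # step image: dark left, bright right
print(conv2d(img, edge, 0.0))            # strong response at the edge
```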


Continue reading

SGD Variants

Posted on Thu 29 March 2018 in Basics • Tagged with SGD, Deep Learning

Let's get a quick overview of the common variants of SGD, along with implementations.
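As a taste, here are two of those variants in plain Python, with the usual textbook defaults for the hyper-parameters:

```python
def sgd(w, grad, lr=0.1):
    """Vanilla SGD: take a step against the gradient."""
    return w - lr * grad

def sgd_momentum(w, grad, v, lr=0.1, beta=0.9):
    """SGD with momentum: v is a decayed running sum of past gradients,
    which damps oscillations and accelerates steady directions."""
    v = beta * v + grad
    return w - lr * v, v

# One step on a toy quadratic loss L(w) = w**2, whose gradient is 2*w.
w, v = 5.0, 0.0
w, v = sgd_momentum(w, 2 * w, v)
```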


Continue reading

A simple neural net in numpy

Posted on Tue 20 March 2018 in Basics • Tagged with Neural Net, Back Propagation

Now that we have seen how to build a neural net in pytorch, let's take it a step further and do the same thing in numpy.
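Here is a condensed sketch of the kind of training loop the article builds up, on a made-up toy task (the sizes and learning rate are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, size=(2, 8))
W2 = rng.normal(0, 0.1, size=(8, 1))
x = rng.normal(size=(16, 2))
y = (x[:, :1] * x[:, 1:] > 0).astype(float)  # toy XOR-like labels

for _ in range(500):
    h = np.maximum(0, x @ W1)            # forward: ReLU hidden layer
    p = 1 / (1 + np.exp(-(h @ W2)))      # sigmoid output
    dp = (p - y) / len(x)                # gradient of mean BCE w.r.t. logits
    dW2 = h.T @ dp
    dh = (dp @ W2.T) * (h > 0)           # backprop through the ReLU
    dW1 = x.T @ dh
    W1 -= 0.05 * dW1                     # gradient descent step
    W2 -= 0.05 * dW2
```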


Continue reading

How Do You Find A Good Learning Rate

Posted on Tue 20 March 2018 in Basics • Tagged with SGD, Learning Rate

The learning rate is the main hyper-parameter to set when training a neural net, but how do you determine the best value? Here's a technique to quickly settle on one.
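In short, the idea is to grow the learning rate exponentially over a few hundred mini-batches while recording the loss, then pick a rate a bit before the point where the loss blows up. A sketch of the sweep schedule, with typical (not prescriptive) bounds:

```python
import numpy as np

def lr_finder_schedule(start=1e-7, end=10.0, num_iters=100):
    """Exponentially spaced learning rates for the sweep: train one
    mini-batch at each rate, log the loss, and plot loss vs. rate."""
    return start * (end / start) ** (np.arange(num_iters) / (num_iters - 1))
```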


Continue reading

A Neural Net In Pytorch

Posted on Fri 16 March 2018 in Basics • Tagged with Neural Net, PyTorch, Deep Learning

The theory is all really nice, but let's actually build a neural net and train it! We'll see how a simple neural net with one hidden layer can learn to recognize digits very efficiently.
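For a preview, here is a minimal one-hidden-layer classifier in PyTorch; the hidden width and the stand-in random batch are illustrative, and the article trains on real digit images:

```python
import torch
from torch import nn

# A one-hidden-layer classifier for 28x28 digit images.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 100),
    nn.ReLU(),
    nn.Linear(100, 10),
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 1, 28, 28)       # stand-in batch of images
y = torch.randint(0, 10, (64,))      # stand-in labels
loss = loss_fn(model(x), y)
opt.zero_grad()
loss.backward()
opt.step()                           # one training step
```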


Continue reading

What Is Deep Learning?

Posted on Tue 13 March 2018 in Basics • Tagged with Neural Net, SGD, Deep Learning

What is deep learning? It's a class of algorithms where you train something called a neural net to complete a specific task. Let's begin with a general overview; we'll dig into the details in subsequent articles.


Continue reading