ai
-
Introduction In the previous blog post we took a look at the backpropagation through time algorithm. We saw how, looking at the unfolded computation graph, backpropagation through time is essentially the same as backpropagating through one long connected feedforward neural net. Now that we have the gradient of the loss with respect to the parameters…
-
Introduction In the previous blog post, we began the implementation of our recurrent neural net, and went over the equations and code for the forward pass. With this, we can now feed our data through the network, but it's not of much use without being able to train it. In order to train our recurrent…
-
Introduction In the previous blog post we talked about what it means exactly for data to have a “sequential topology”, and prepared our training data to be fed into the recurrent neural net. In this post, we’ll be looking at the design of our simple recurrent neural net, the equations for the forward pass, and…
-
Introduction In the previous blog post we looked at how a recurrent neural network differs from a feedforward neural network, and why they are better at sequence processing tasks. In this blog post, we will be preparing our training data, and start writing some code. Our training data is just a list of lowercase names.…
-
Introduction In a previous series of blog posts I covered feedforward neural networks. Feedforward neural networks are very powerful, but they are not the only neural network architecture available and they may be ill-suited to certain tasks. Recurrent neural nets are a class of neural nets that are particularly effective for modeling data with a…
-
Introduction This is the seventh installment in a series of blog posts about Support Vector Machines. If you have not read the first six blog posts, I highly recommend you go back and read those before continuing to this blog post. Before, we implemented SVM using a Quadratic Programming solver in Python called CVXOPT. This worked…
-
Introduction This is the sixth installment in a series of blog posts about Support Vector Machines. If you have not read the first five blog posts, I highly recommend you go back and read those before continuing on to this blog post. Last time we looked at kernels and the kernel trick, which allowed us to…
-
Introduction This is the fifth installment in a series of blog posts about Support Vector Machines. If you have not read the first four blog posts, I highly recommend you go back and read those before continuing to this blog post. Last time we introduced soft-margin SVM, which was a new formulation that could accommodate…
-
Introduction This is the fourth installment in a series of blog posts about Support Vector Machines. If you have not read the first three blog posts, I highly recommend you go back and read those before continuing to this blog post. Last time we talked about Quadratic Programming, and successfully used a QP solver to…
-
Introduction This is the second installment in a series of blog posts about Support Vector Machines. If you have not read the first blog post, I highly recommend you go back and read it before continuing to this blog post. Last time we looked at how to define the ideal hyperplane to linearly separate two…