Backpropagation explained | Part 5 – What puts the “back” in backprop?

In this video, we’ll see the math that explains how backpropagation works backwards through a neural network. In the previous video, we saw how to calculate the gradient of the loss function using backpropagation. We haven’t yet seen, though, where the backwards movement comes into…
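As a rough sketch of where that backwards movement comes from, in one standard layered notation (which may not match the video’s exact symbols), the error at the output layer is

\[ \delta^{L} = \nabla_{a} C \odot \sigma'(z^{L}), \]

and each earlier layer’s error is obtained from the layer after it,

\[ \delta^{l} = \left( (W^{l+1})^{T} \delta^{l+1} \right) \odot \sigma'(z^{l}), \]

so the computation necessarily runs backwards, from the output layer toward the input.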

Backpropagation explained | Part 4 – Calculating the gradient

We’re now on video number 4 in our journey through understanding backpropagation. In our last video, we focused on how to mathematically express certain facts about the training process. Now we’re going to use these expressions to help us differentiate the loss of…
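In the same standard notation (again, not necessarily the video’s exact symbols), differentiating the loss with respect to a single weight comes down to the chain rule:

\[ \frac{\partial C}{\partial w^{l}_{jk}} = \frac{\partial C}{\partial a^{l}_{j}} \, \frac{\partial a^{l}_{j}}{\partial z^{l}_{j}} \, \frac{\partial z^{l}_{j}}{\partial w^{l}_{jk}} = \delta^{l}_{j} \, a^{l-1}_{k}, \]

where \( \delta^{l}_{j} = \partial C / \partial z^{l}_{j} \) is the per-node error term and \( a^{l-1}_{k} \) is the activation feeding into the weight.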

Backpropagation explained | Part 2 – The mathematical notation

In our last video on backpropagation, we covered the intuition behind backpropagation’s role during the training of an artificial neural network. Now, we’re going to focus on the math underlying backprop. The math is pretty involved, so we’re going to break…
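For reference, a common choice of notation for this kind of derivation (which may differ in small ways from the one used in the series) is: \( w^{l}_{jk} \) for the weight connecting node \( k \) in layer \( l-1 \) to node \( j \) in layer \( l \), \( z^{l}_{j} = \sum_{k} w^{l}_{jk} a^{l-1}_{k} + b^{l}_{j} \) for that node’s weighted input, and \( a^{l}_{j} = \sigma(z^{l}_{j}) \) for its activation.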

Backpropagation explained

In this video, we’re going to discuss backpropagation and its role in the training process of a neural network. We’re going to start with a quick recap of some of the points about Stochastic Gradient Descent that we learned…
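As a one-line reminder of the Stochastic Gradient Descent step that recap refers to (written in generic notation, not necessarily the video’s), each weight is nudged against its gradient,

\[ w \leftarrow w - \eta \, \frac{\partial C}{\partial w}, \]

where \( \eta \) is the learning rate and the gradient is estimated from a mini-batch; backpropagation is the procedure that computes \( \partial C / \partial w \).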

Gradient descent, how neural networks learn | Chapter 2, deep learning

To learn more, I highly recommend the book by Michael…