In this video, we’ll see the math that explains how backpropagation works backwards through a neural network.
In the previous video, we saw how to calculate the gradient of the loss function using backpropagation. What we haven’t yet seen, though, is where the backwards movement we talked about in the intuition video actually comes into play.
So now, we’re going to build on the knowledge we’ve already developed to understand what exactly puts the back in backpropagation. The explanation will be math-based, so we’ll first explore the motivation behind the calculations we’ll be working through.
We’ll then jump right into the calculations, which, we’ll see, are actually quite similar to the ones we worked through in the previous video.
After we’ve got the math down, we’ll bring everything together to see why these calculations are mathematically done in a backwards fashion.
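To preview the idea in code, here is a minimal sketch, not the exact network from the videos, of a two-layer network with sigmoid activations and a mean squared error loss. The shapes, weights, and variable names are illustrative assumptions; the point is that the chain rule forces us to compute the gradient at the last layer first and then reuse it for the layer before it, which is the "back" in backpropagation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))    # input (illustrative shapes)
y = np.array([[1.0]])          # target
W1 = rng.normal(size=(2, 3))   # first-layer weights
W2 = rng.normal(size=(1, 2))   # second-layer weights

# Forward pass, moving left to right through the network
z1 = W1 @ x
a1 = sigmoid(z1)
z2 = W2 @ a1
a2 = sigmoid(z2)
loss = 0.5 * np.sum((a2 - y) ** 2)

# Backward pass, moving right to left:
# delta2 (the output layer's gradient) must be computed first,
# because delta1 is defined in terms of it via the chain rule.
delta2 = (a2 - y) * a2 * (1 - a2)          # dLoss/dz2
grad_W2 = delta2 @ a1.T                    # dLoss/dW2
delta1 = (W2.T @ delta2) * a1 * (1 - a1)   # dLoss/dz1, reuses delta2
grad_W1 = delta1 @ x.T                     # dLoss/dW1
```

Notice that `delta1` cannot be computed until `delta2` exists. That dependency is exactly the backwards movement we'll be deriving with the math in this video.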
The Most Human Human: What Artificial Intelligence Teaches Us About Being Alive: http://amzn.to/2GtjKqu