Backpropagation explained | Part 2 – The mathematical notation



In our last video on backpropagation, we covered the intuition behind backpropagation's role during the training of an artificial neural network. Now, we're going to focus on the math underlying backprop. The math is pretty involved, so we're going to break it up into bite-sized chunks across a few videos.

We're going to start this video by quickly recapping how backpropagation is used during the training process. Then, we'll jump over to the math side of things and open our discussion by going over the notation and definitions that we'll be using for our backprop calculations going forward.

These definitions and this notation will be the focus of this video. The math underlying backprop relies heavily on what we'll be introduced to here, so it's crucial to understand these things before moving forward.

Lastly, we'll narrow our focus to the several indices that the notation depends on, previewed in the sketch below.
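
As a quick preview, here is a sketch of the style of notation involved, written in LaTeX and following a common convention for fully connected networks. The exact symbols below are illustrative assumptions on our part; the video introduces its own definitions.

% Illustrative indices (assumed convention, not a verbatim copy of the video's):
% l : the layer index
% j : the index of a node in layer l
% k : the index of a node in layer l-1
w_{jk}^{(l)}  % the weight connecting node k in layer l-1 to node j in layer l
z_{j}^{(l)} = \sum_{k} w_{jk}^{(l)} a_{k}^{(l-1)} + b_{j}^{(l)}  % the weighted input to node j in layer l
a_{j}^{(l)} = g^{(l)}\left( z_{j}^{(l)} \right)  % the activation of node j in layer l

Notice that a single quantity like a weight carries three indices at once: one for the layer and two for the nodes it connects. Keeping these indices straight is exactly why we spend time on notation before doing any calculus.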

