Gradient descent, how neural networks learn | Chapter 2, deep learning


Subscribe for more (part 3 will be on backpropagation):
Thanks to everybody supporting on Patreon.

If you're an early-stage ML startup founder, Amplify Partners would love to hear from you via [email protected]

To learn more, I highly recommend the book Neural Networks and Deep Learning by Michael Nielsen
The book walks through the code behind the example in these videos, which you can find here:
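If you just want the core idea before diving into that code, here's a minimal sketch of gradient descent on a toy one-variable cost function. The function, learning rate, and step count are all illustrative choices of mine, not taken from the video or from Nielsen's MNIST network:

```python
# Toy cost function C(w) = (w - 3)^2, minimized at w = 3.
def cost(w):
    return (w - 3.0) ** 2

def grad(w):
    # Analytic derivative dC/dw = 2 * (w - 3)
    return 2.0 * (w - 3.0)

w = 0.0              # initial guess
learning_rate = 0.1  # step size (a hyperparameter you choose)

for _ in range(100):
    w -= learning_rate * grad(w)  # step downhill, along the negative gradient

print(round(w, 4))  # → 3.0, the minimum of the cost function
```

A real network does the same thing, except `w` is thousands of weights and biases, the cost averages over training examples, and the gradient comes from backpropagation (the subject of part 3).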

MNIST database:

Also check out Chris Olah’s blog:
His post on neural networks and topology is particularly beautiful, but honestly all of the stuff there is great.

And if you like that, you’ll *love* the publications at Distill:

For more videos, Welch Labs also has some great series on machine learning:

“But I’ve already voraciously consumed Nielsen’s, Olah’s, and Welch’s works,” I hear you say. Well, well, look at you then. That being the case, I might recommend that you continue on with the book “Deep Learning” by Goodfellow, Bengio, and Courville.

Thanks to Lisha Li (@lishali88) for her contributions at the end, and for letting me pick her brain so much about the material. Here are the articles she referenced at the end:

Music by Vincent Rubinetti:


3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you’re into that).

If you are new to this channel and want to see more, a good place to start is this playlist:

Various social media stuffs:

Copyright © 2018 NEURALSCULPT