Recurrent Neural Networks, LSTMs, and Vanishing & Exploding Gradients – Fun and Easy Machine Learning
Recurrent Neural Networks (RNNs) are popular models that have shown great promise in many Natural Language Processing (NLP) tasks. For example, if you ask Google Assistant who landed on the moon first, followed by the question "how old is he?", you will get back the age of Neil Armstrong. Ask Siri the same thing, and it won't be able to answer the second question, because it only looks at the current input and does not remember the questions you asked before. Well, until they eventually update the next iOS anyway.
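That ability to "remember" comes from a hidden state that the network carries forward from one input to the next. Here is a minimal sketch of that recurrence (the sizes and random weights are made up purely for illustration, not taken from the video):

```python
import numpy as np

# Illustrative sizes and random weights; a real model would learn these.
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 8

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One time step: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b)."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Feed a short "sentence" of 3 token vectors through the network.
h = np.zeros(hidden_size)        # empty memory before the first word
for x_t in rng.normal(size=(3, input_size)):
    h = rnn_step(x_t, h)         # each step mixes new input with old state

print(h.shape)  # the final state summarizes the whole sequence
```

Because `h` is passed back in at every step, information from earlier inputs can influence later outputs, which is exactly what a model like Siri (in this example) lacks when it treats each question independently.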
If you are new to neural networks, please check out my lectures on Artificial Neural Networks and Convolutional Neural Networks so that this lecture makes sense.
In this lecture we deal with Recurrent Neural Networks and Long Short-Term Memory networks (LSTMs), and we also discuss the vanishing gradient and exploding gradient problems.
Thank you to Denny Britz of the WildML blog for permission to use his work and diagrams in this video. Here is a link to his blog for more detail on RNNs and the mathematics that powers them.
Support us on Patreon, so we can bring you more cool Machine and Deep Learning Content 🙂
To learn more about Augmented Reality, IoT, Machine Learning, FPGAs, Arduinos, PCB Design and Image Processing, then check out
Please like and Subscribe for more videos 🙂