Description

Recurrent networks can be improved to remember long-range dependencies by using what's called a Long Short-Term Memory (LSTM) cell. Let's build one using just numpy! I'll go over the cell components as well as the forward and backward pass logic.
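To give you a feel for the cell components before watching, here's a minimal sketch of a single LSTM cell's forward step in numpy. It is not the exact code from the repo linked below, just an illustration of the standard gate equations; the parameter names (Wf, Wi, Wo, Wc and their biases) are my own labels for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_forward(x, h_prev, c_prev, params):
    """One LSTM step: compute gates, candidate, new cell state, new hidden state.

    params is a dict of weight matrices of shape (hidden, hidden + input)
    and bias vectors of shape (hidden,). Names here are illustrative.
    """
    # Stack the previous hidden state and the current input into one vector
    z = np.concatenate([h_prev, x], axis=0)

    f = sigmoid(params["Wf"] @ z + params["bf"])   # forget gate: what to erase from the cell state
    i = sigmoid(params["Wi"] @ z + params["bi"])   # input gate: what new information to write
    o = sigmoid(params["Wo"] @ z + params["bo"])   # output gate: what to expose as the hidden state
    g = np.tanh(params["Wc"] @ z + params["bc"])   # candidate values for the cell state

    c = f * c_prev + i * g      # new cell state: keep some old memory, add some new
    h = o * np.tanh(c)          # new hidden state
    return h, c

# Tiny usage example with random parameters (hidden size 4, input size 3)
hidden, inputs = 4, 3
rng = np.random.default_rng(0)
params = {}
for gate in ("f", "i", "o", "c"):
    params["W" + gate] = rng.standard_normal((hidden, hidden + inputs)) * 0.1
    params["b" + gate] = np.zeros(hidden)

h, c = lstm_cell_forward(rng.standard_normal(inputs), np.zeros(hidden), np.zeros(hidden), params)
print(h.shape, c.shape)  # (4,) (4,)
```

The backward pass (covered in the video) just applies the chain rule through these same equations, accumulating gradients for each gate's weights across time steps.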

Code for this video:
https://github.com/llSourcell/LSTM_Networks

Please subscribe! And like. And comment. That's what keeps me going.

More learning resources:
https://www.youtube.com/watch?v=ftMq5ps503w
https://www.youtube.com/watch?v=cdLUzrjnlr4
https://www.youtube.com/watch?v=hWgGJeAvLws
http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
https://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

And please support me on Patreon:
https://www.patreon.com/user?u=3191693
Follow me:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval/