Let's discuss the math behind back-propagation. We'll go over the 3 concepts from calculus you need to understand it (derivatives, partial derivatives, and the chain rule), and then implement it programmatically.
Code for this video:
https://github.com/llSourcell/how_to_do_math_for_deep_learning
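The linked repo has the full walkthrough. As a minimal sketch of the idea (this is not the repo's exact code, and the toy dataset is hypothetical): a single sigmoid neuron is trained by applying the chain rule to get the gradient of the loss with respect to each weight, then stepping downhill.

```python
import numpy as np

# The chain rule in action: dL/dw = dL/dpred * dpred/dz * dz/dw

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy dataset: the output equals the first input column.
X = np.array([[0., 0., 1.],
              [0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 1.]])
y = np.array([[0., 0., 1., 1.]]).T

rng = np.random.default_rng(1)
w = rng.standard_normal((3, 1))   # random initial weights

for _ in range(10000):
    z = X @ w                     # forward pass: weighted sum (dz/dw = X)
    pred = sigmoid(z)             # activation (dpred/dz = pred * (1 - pred))
    error = pred - y              # dL/dpred for a squared-error loss (up to a constant)
    grad = X.T @ (error * pred * (1 - pred))  # chain rule, summed over examples
    w -= grad                     # gradient-descent step (learning rate 1)
```

After training, `np.round(pred)` recovers the targets, which is the whole point: the chain rule lets us assign blame for the output error to each individual weight.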
Please Subscribe! And like. And comment. That's what keeps me going.
I've used this code in a previous video. I had to keep it as simple as possible so I could layer on these mathematical explanations and keep the video at around 5 minutes.
More Learning resources:
https://mihaiv.wordpress.com/2010/02/08/backpropagation-algorithm/
http://outlace.com/Computational-Graph/
http://briandolhansky.com/blog/2013/9/27/artificial-neural-networks-backpropagation-part-4
https://jeremykun.com/2012/12/09/neural-networks-and-backpropagation/
https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/
Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/
And please support me on Patreon:
https://www.patreon.com/user?u=3191693
I forgot to add my patron shoutout at the end, so special thanks to Patrons Tim Jiang, HG Oh, Hoang, Advait Shinde, Vijay Daniel & Umesh Rangasamy.
Follow me:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval/