Description

The most popular optimization strategy in machine learning is gradient descent. When gradient descent is applied to neural networks, the technique used to compute the gradients is called back-propagation. In this video, I'll use analogies, animations, equations, and code to give you an in-depth understanding of this technique. It uses calculus (the chain rule) to tell us how to update our machine learning models. Once you feel comfortable with back-propagation, everything else becomes easier. Enjoy!
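
The linked repo below has the full walkthrough; here is just a minimal sketch of the idea in plain NumPy. The XOR toy data, the 2-3-1 network size, and the learning rate are my own illustrative choices, not taken from the video:

```python
# Minimal sketch: gradient descent with back-propagation on a tiny network.
# (Toy example only -- XOR data, 2-3-1 sigmoid network, hand-picked learning rate.)
import numpy as np

np.random.seed(0)

# Toy data: the XOR function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A tiny 2-3-1 network with random starting weights
W1, b1 = np.random.randn(2, 3), np.zeros(3)
W2, b2 = np.random.randn(3, 1), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate: how big a step gradient descent takes

for step in range(5000):
    # Forward pass: compute predictions and the mean squared error
    h = sigmoid(X @ W1 + b1)       # hidden layer activations
    y_hat = sigmoid(h @ W2 + b2)   # network output
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass (back-propagation): apply the chain rule layer by layer
    d_yhat = 2 * (y_hat - y) / len(X)      # dLoss/dy_hat
    d_z2 = d_yhat * y_hat * (1 - y_hat)    # through the output sigmoid
    dW2 = h.T @ d_z2                        # gradient for W2
    db2 = d_z2.sum(axis=0)                  # gradient for b2
    d_h = d_z2 @ W2.T                       # pass the error back to the hidden layer
    d_z1 = d_h * h * (1 - h)                # through the hidden sigmoid
    dW1 = X.T @ d_z1                        # gradient for W1
    db1 = d_z1.sum(axis=0)                  # gradient for b1

    # Gradient descent: nudge every weight a little against its gradient
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", round(float(loss), 4))
print("predictions:", y_hat.round(2).ravel())
```

Run it and the loss should shrink over the iterations: each pass computes every weight's gradient with the chain rule (back-propagation) and then takes one small gradient descent step.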

Code for this video:
https://github.com/llSourcell/backpropagation_explained

Please Subscribe! And like. And comment. That's what keeps me going.

Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval

This video is a part of my Machine Learning Journey course:
https://github.com/llSourcell/Machine_Learning_Journey

More learning resources:
https://www.youtube.com/watch?v=XdM6ER7zTLk
https://www.youtube.com/watch?v=nhqo0u1a6fw
https://www.youtube.com/watch?v=jc2IthslyzM
https://www.youtube.com/watch?v=IHZwWFHWa-w
https://www.youtube.com/watch?v=umAeJ7LMCfU
http://neuralnetworksanddeeplearning.com/chap2.html

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

Sign up for the next course at The School of AI:
https://www.theschool.ai

And please support me on Patreon:
https://www.patreon.com/user?u=3191693