Have you ever wondered what the math behind neural networks looks like? What gives them such incredible power? We're going to cover 4 different neural networks in this video (2 feedforward networks, 1 recurrent network, and a self-organizing map) to develop an intuition around their basic principles. Prepare yourself, deep learning is coming.
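
To give you a taste before you watch: here is a minimal sketch of the kind of feedforward network the video covers, trained with plain backpropagation on a tiny made-up XOR-style dataset. The data, layer sizes, and variable names are just illustrative assumptions, not the code from the video's repo (that's linked below).

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Toy dataset: 4 examples, 3 input features, 1 binary label each (made up for illustration)
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])
y = np.array([[0], [1], [1], [0]])

np.random.seed(1)
w0 = 2 * np.random.random((3, 4)) - 1   # input -> hidden weights
w1 = 2 * np.random.random((4, 1)) - 1   # hidden -> output weights

for step in range(10000):
    # Forward pass: two layers of weighted sums squashed by sigmoids
    hidden = sigmoid(X @ w0)
    output = sigmoid(hidden @ w1)

    # Backward pass: error scaled by the sigmoid derivative at each layer
    output_delta = (y - output) * output * (1 - output)
    hidden_delta = (output_delta @ w1.T) * hidden * (1 - hidden)

    # Full-batch weight updates
    w1 += hidden.T @ output_delta
    w0 += X.T @ hidden_delta

print(output.round(3))  # predictions should approach [[0], [1], [1], [0]]

That forward pass / backward pass loop is the core idea behind the feedforward examples; the recurrent network and self-organizing map in the video build on the same ingredients with different update rules.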

Code for this video (with coding challenge):
https://github.com/llSourcell/neural_networks

Hammad's winning code:
https://github.com/hammadshaikhha/Math-of-Machine-Learning-Course-by-Siraj/tree/master/Regularization%20in%20Linear%20Regression

Ong's runner-up code:
https://github.com/jrios6/Math-of-Intelligence/tree/master/3-Regularization

More learning resources:
https://www.youtube.com/watch?v=h3l4qz76JhQ
http://www.ai-junkie.com/ann/som/som1.html
http://iamtrask.github.io/2015/07/12/basic-python-network/
https://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
https://www.youtube.com/watch?v=vOppzHpvTiQ&list=PL2-dafEMk2A7YdKv4XfKpfbTH5z6rEEj3

Please subscribe! And like. And comment. That's what keeps me going.

And please support me on Patreon: https://www.patreon.com/user?u=3191693
Follow me:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval/