Google just made it really easy for anyone to use a "Tensor Processing Unit" (TPU) to train machine learning models. Google Colab, their free research notebook tool, now lets developers select a TPU as their runtime environment. Are TPUs the next big thing in machine learning? In this video, I'll benchmark the TPU against the GPU, talk about what the hardware looks like, describe its use cases, then do some TPU-specific live programming to train a model for natural language processing. Get hype!
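For reference, here's a minimal sketch of how a notebook can check whether a TPU runtime is attached. It relies on Colab's convention of exposing the TPU's gRPC address in the COLAB_TPU_ADDR environment variable after you pick TPU under Runtime > Change runtime type; the actual model code is in the repo linked below.

```python
import os

# In Colab, selecting Runtime > Change runtime type > TPU sets this
# environment variable to the TPU worker's address (e.g. "10.0.0.2:8470").
tpu_addr = os.environ.get('COLAB_TPU_ADDR')

if tpu_addr:
    # TensorFlow's TPU APIs expect a grpc:// URL built from this address.
    print('TPU found at grpc://' + tpu_addr)
else:
    print('No TPU attached; falling back to the CPU/GPU runtime.')
```

Outside of Colab (or with a non-TPU runtime selected), the variable is simply unset, so the check degrades gracefully.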
Code is here:
https://github.com/llSourcell/TPU_Machine_Learning
Please Subscribe! And like. And comment. That's what keeps me going.
Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval
The School of AI:
https://www.theschool.ai
Github Syllabus:
https://github.com/llSourcell/Move_37_Syllabus
More learning resources:
https://cloud.google.com/blog/products/gcp/an-in-depth-look-at-googles-first-tensor-processing-unit-tpu
https://medium.com/@CPLu/should-we-all-embrace-systolic-array-df3830f193dc
https://medium.com/intuitionmachine/googles-ai-processor-is-inspired-by-the-heart-d0f01b72defe
https://github.com/UCSBarchlab/OpenTPU
Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/
And please support me on Patreon:
https://www.patreon.com/user?u=3191693