Learning the tools that preserve user privacy is going to become an increasingly important skill for aspiring data scientists in the coming months. Legal frameworks like GDPR are being enacted around the world as people realize how valuable their data is, so data scientists need to accept that they'll have to handle data differently than in the past. In this video, I'll demo 3 important privacy techniques: differential privacy, secure multi-party computation, and federated learning. We'll use these techniques to train a model built with Python to predict diabetes while keeping user data anonymous. Enjoy!
Code for this video:
https://github.com/OpenMined/PySyft/tree/master/examples/tutorials
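If you want a feel for two of these ideas before diving into the full tutorials, here's a minimal sketch in plain Python/NumPy: the Laplace mechanism for differential privacy, plus federated averaging of locally trained models. Everything here (function names, simulated hospital data) is made up purely for illustration; the actual notebooks in the repo above use PySyft and PyTorch, whose APIs differ by version.

import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon):
    # Add Laplace noise scaled to sensitivity/epsilon so the released
    # statistic satisfies epsilon-differential privacy for this query.
    return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release how many patients in a dataset have diabetes.
labels = np.random.randint(0, 2, size=768)  # stand-in for real diabetes labels
private_count = laplace_mechanism(labels.sum(), sensitivity=1, epsilon=0.5)

def local_fit(X, y):
    # Least-squares fit on one hospital's data; only the weights leave the site.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def federated_average(weight_list):
    # The server averages model weights without ever seeing raw patient records.
    return np.mean(weight_list, axis=0)

# Three simulated hospitals, each holding its own local data.
hospitals = [(np.random.randn(100, 8), np.random.randint(0, 2, 100)) for _ in range(3)]
global_weights = federated_average([local_fit(X, y) for X, y in hospitals])

Real federated learning repeats this send-train-average loop over many rounds, and PySyft adds the plumbing (remote workers, pointer tensors) so you can write it in ordinary PyTorch.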
Please subscribe! And like. And comment. That's what keeps me going.
Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Instagram: https://www.instagram.com/sirajraval
Facebook: https://www.facebook.com/sirajology
More learning resources:
https://www.openmined.org/
https://iamtrask.github.io/2017/03/17/safe-ai/
https://mortendahl.github.io/
https://florian.github.io/federated-learning/
https://towardsdatascience.com/whats-new-in-deep-learning-research-understanding-federated-learning-b14e7c3c6f89
http://www.cleverhans.io/privacy/2018/04/29/privacy-and-machine-learning.html
Join us at the School of AI:
https://theschool.ai/
Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/
Please support me on Patreon:
https://www.patreon.com/user?u=3191693
Sign up for my newsletter for exciting updates in the field of AI:
https://goo.gl/FZzJ5w
Hiring? Need a job? See our job board:
https://www.theschool.ai/jobs/
Need help on a project? See our consulting group:
https://www.theschool.ai/consulting-group/