About the speaker
Chang Liu is an Applied Research Scientist and a member of the Georgian Impact team. She brings her in-depth knowledge of mathematical and combinatorial optimization to helping Georgian’s portfolio companies. Chang holds a Master of Applied Science in Operations Research from the University of Toronto, where she specialized in combinatorial optimization. She also holds a Bachelor’s degree in Mathematics from the University of Waterloo.
About the talk
This talk will introduce differential privacy and its use cases, discuss the new component of the TensorFlow Privacy library, and offer real-world scenarios for how to apply the tools. In recent years, the world has become increasingly data-driven, and individuals and organizations have developed a stronger awareness of, and concern for, the privacy of their sensitive data.
It has been shown that it is impossible to disclose statistical results about a private database without revealing some information. In fact, the entire database could be recovered from a few query results.
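To make this concrete, here is a minimal, hypothetical sketch (in Python, with made-up toy data and exact, un-noised queries) of a classic differencing attack: two aggregate answers that differ by a single record are enough to recover that record's private value.

# Illustrative differencing attack on a hypothetical toy database.
salaries = {"alice": 82_000, "bob": 95_000, "carol": 61_000}  # made-up data

def total_salary(db, exclude=None):
    """Answer an exact SUM query, optionally excluding one person."""
    return sum(v for k, v in db.items() if k != exclude)

# An analyst restricted to aggregate queries can still isolate Bob's value:
everyone = total_salary(salaries)                  # SUM over all records
everyone_but_bob = total_salary(salaries, "bob")   # SUM over all but Bob

print("Bob's salary:", everyone - everyone_but_bob)  # prints 95000

Differential privacy counters exactly this kind of attack by adding calibrated noise, so that any two databases differing in one record produce nearly indistinguishable query answers.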
Following research on the privacy of sensitive databases, a number of big players such as Google, Apple, and Uber have turned to differential privacy to help guarantee the privacy of sensitive data. That attention from major technology firms has helped bring differential privacy out of research labs and into the realm of software engineering and product development.
Differential privacy is now something that smaller firms and software startups are adopting and finding great value in. Apart from privacy guarantees, advances in differential privacy also allow businesses to unlock more capabilities and increased data utility.
One of these capabilities includes the ability to transfer knowledge from existing data through differentially private ensemble models without data privacy concerns. As differential privacy garners recognition in large tech companies, efforts to make current state-of-the-art research more accessible to the general public and small startups are underway.
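One well-known recipe for this kind of knowledge transfer is the PATE approach (Papernot et al.), in which "teacher" models trained on disjoint private partitions vote on unlabeled public examples, and only noise-perturbed vote counts are released to train a "student" model. The sketch below is an illustrative simplification of that aggregation step, not the specific method covered in the talk; all names and parameters are assumptions.

# Illustrative noisy vote aggregation in the spirit of PATE.
import numpy as np

rng = np.random.default_rng(0)

def noisy_aggregate(teacher_votes, num_classes, noise_scale=1.0):
    """Return the argmax class after adding Laplace noise to the vote counts."""
    counts = np.bincount(teacher_votes, minlength=num_classes).astype(float)
    counts += rng.laplace(loc=0.0, scale=noise_scale, size=num_classes)
    return int(np.argmax(counts))

# Ten hypothetical teachers vote on one unlabeled public example.
votes = np.array([1, 1, 0, 1, 1, 1, 0, 1, 1, 1])
print("Noisy aggregated label:", noisy_aggregate(votes, num_classes=2))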
As a contribution to the broader community, Georgian Partners has provided its differential privacy library to the TensorFlow community. Together, we will make differentially private stochastic gradient descent available through a user-friendly API that allows users to train private logistic regression models.
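As a rough sketch of what such an API can look like, the snippet below trains a logistic-regression-style Keras model with the DPKerasSGDOptimizer from TensorFlow Privacy. It uses synthetic data, and exact class names, import paths, and hyperparameters may differ between library versions, so treat it as an assumption-laden illustration rather than the talk's exact API.

# Sketch: DP-SGD training of a logistic-regression-style model with TensorFlow Privacy.
import numpy as np
import tensorflow as tf
from tensorflow_privacy.privacy.optimizers.dp_optimizer_keras import DPKerasSGDOptimizer

# Synthetic binary-classification data (stand-in for real, sensitive records).
X = np.random.randn(1024, 20).astype("float32")
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype("float32")

batch_size = 32  # must be divisible by num_microbatches below

# A single dense layer with a sigmoid output is logistic regression in Keras terms.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(1, activation="sigmoid", input_shape=(20,))]
)

optimizer = DPKerasSGDOptimizer(
    l2_norm_clip=1.0,       # clip each per-example gradient to this L2 norm
    noise_multiplier=1.1,   # Gaussian noise scale relative to the clip norm
    num_microbatches=batch_size,
    learning_rate=0.15,
)

# Per-example (unreduced) losses are required so gradients can be clipped individually
# before noise is added.
loss = tf.keras.losses.BinaryCrossentropy(reduction=tf.keras.losses.Reduction.NONE)

model.compile(optimizer=optimizer, loss=loss, metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=batch_size)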