Differential Privacy — Under the Hood
Apple released a little more information about #DifferentialPrivacy with a paper and an excellent blog post (°).
A quick refresher - under differential privacy, you:
• collect users’ data (locally, or on the server),
• add noise to mask the data, but in such a way that when you
• average the data, the noise averages out, and meaningful information emerges.
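Those three steps can be sketched with classic randomized response, the simplest local-DP mechanism (a minimal illustration, not the specific algorithms in Apple's paper; the reporting probability and population rate below are made up):

```python
import random

P_TRUTH = 0.75  # probability a user reports their true bit (made-up value)

def randomized_response(true_bit):
    """Report the true bit with probability P_TRUTH; otherwise a fair coin flip."""
    if random.random() < P_TRUTH:
        return true_bit
    return random.randint(0, 1)

random.seed(0)
n = 100_000
true_rate = 0.30  # hypothetical fraction of users with the sensitive attribute
reports = [randomized_response(1 if random.random() < true_rate else 0)
           for _ in range(n)]

# Any individual report is deniable (it may be a coin flip), but in aggregate
# the noise averages out: E[report] = P_TRUTH * true_rate + (1 - P_TRUTH) * 0.5,
# so we can invert that expectation to recover the population rate.
observed = sum(reports) / n
estimate = (observed - (1 - P_TRUTH) * 0.5) / P_TRUTH
print(estimate)  # lands close to the true 0.30
```

The server never learns any individual's bit with certainty, yet the aggregate estimate is accurate to within a fraction of a percent at this sample size.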
The trick, of course, is in the details. At the very least, you need to be careful about:
• calibrating the noise so that it averages out (algorithms matter!)
• collecting and transmitting information in a manner that protects privacy (secure storage! encryption!)
• keeping the information under privacy exposure thresholds (don’t combine data across different use cases! restrict the number of use cases!)
etc.
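On the calibration point, the textbook approach is the Laplace mechanism: scale the noise to the query's sensitivity divided by the privacy budget ε (again a sketch of the general idea, not the count-sketch machinery in Apple's paper; the counts and ε below are made up):

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def noisy_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with noise scaled to sensitivity / epsilon.

    A counting query has sensitivity 1: adding or removing one user
    changes the answer by at most 1, so scale = 1 / epsilon.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(1)
# Hypothetical: release the same count of 1000 users many times at eps = 0.5.
releases = [noisy_count(1000, epsilon=0.5) for _ in range(10_000)]
avg = sum(releases) / len(releases)
# The noise is zero-mean, so the average of many releases sits near 1000.
```

The catch, and the reason for the exposure-threshold bullet above, is composition: releasing the same statistic k times spends roughly k·ε of privacy budget, so the noise only "averages out for free" across different users, not across repeated queries about the same one.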
It's a fascinating read, and, I suspect, the very beginning of the field. I expect signal processing (Huffman coding, etc.) to come into play any day now.
(°) Learning with Privacy at Scale - the paper - https://goo.gl/TKvTae, and the blog post - https://goo.gl/XbJcT4