Differential Privacy


Differential privacy is a new approach to data privacy, one that offers benefits to both data subjects and data analysts. Data subjects benefit from solid, mathematically based guarantees of privacy, while data analysts are able to ask questions that would not be possible with more traditional forms of data protection.

Differential privacy automatically adjusts the accuracy of query results, giving results that are as accurate as possible while maintaining privacy. As long as queries pose no threat to privacy, a differentially private system can provide quite accurate results.

If a query is more sensitive, the system will provide a less accurate result. The more sensitive the query, the more noise is added to the result. All this is automatic: no one needs to anticipate how sensitive a query will be because the system determines the sensitivity from the data itself.
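To make this concrete, here is a minimal sketch of the standard Laplace mechanism, the best-known way to implement this idea. The function names, the toy dataset, and the choice of a count query are illustrative assumptions, not a description of any particular product: a count has sensitivity 1 (adding or removing one person changes the result by at most 1), so Laplace noise with scale 1/ε suffices for ε-differential privacy.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    # A count query has sensitivity 1: one person's presence or absence
    # changes the true answer by at most 1. Adding Laplace noise with
    # scale sensitivity/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)

# Illustrative only: count adults in a toy list of ages.
ages = [23, 41, 17, 65, 34, 12, 58]
noisy_count = private_count(ages, lambda a: a >= 18, epsilon=1.0)
```

Note how the trade-off described above appears directly in the code: a smaller ε (a stricter privacy guarantee) means a larger noise scale and a less accurate result.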

Differential privacy removes ad hoc restrictions on data access, focusing on the privacy implications of query results, not on the data fields per se. While traditional deidentification strategies remove some data fields on the assumption that they may pose a privacy risk, a differentially private system objectively determines whether a report would pose a privacy risk and does not limit results unnecessarily.

If you’d like to learn more about differential privacy and how it could benefit your business, please call or email to schedule a free initial consultation.

LET’S TALK

Trusted consultants to some of the world’s leading companies

Amazon, Facebook, Google, US Army Corps of Engineers, Amgen, Microsoft, Hitachi Data Systems