Fix a rare privacy bug in DistinctPerKey in Privacy on Beam. (#84)

The bug occurred when outlier users in the input contributed to many
partitions and/or many values AND the contributed values were also
contributed by other users (the second condition is critical: if a
value came from a single user only, the bug did not occur). In that
case, the output might not be DP due to incorrect contribution
bounding. See the comments in the newly added tests for concrete
examples of when/how the bug used to occur.

This is cherry-picked from the main branch commit
e149618d032f97b476fecc5839a278cc680def08.

This commit includes a minor change compared to the one on the main
branch: it removes the error output for the kv.Codec.Encode/Decode
functions used in aggregations.go, since the error output for these
functions wasn't implemented in v1.0.0.
6 files changed
tree: 9d89161610354fddc97b57b3f4e81755d677a898

Differential Privacy

This repository contains libraries to generate ε- and (ε, δ)-differentially private statistics over datasets. It contains the following tools.

  • Privacy on Beam is an end-to-end differential privacy framework built on top of Apache Beam. It is intended to be easy to use, even by non-experts.
  • Three “DP building block” libraries, in C++, Go, and Java. These libraries implement basic noise addition primitives and differentially private aggregations. Privacy on Beam is implemented using these libraries.
  • A stochastic tester, used to help catch regressions that could make the differential privacy property no longer hold.
  • A differential privacy accounting library, used for tracking privacy budget.
  • A command line interface for running differentially private SQL queries with ZetaSQL.

To get started on generating differentially private data, we recommend you follow the Privacy on Beam codelab.

Currently, the DP building block libraries support the following algorithms:

| Algorithm                        | C++       | Go        | Java      |
| -------------------------------- | --------- | --------- | --------- |
| Laplace mechanism                | Supported | Supported | Supported |
| Gaussian mechanism               | Supported | Supported | Supported |
| Count                            | Supported | Supported | Supported |
| Sum                              | Supported | Supported | Supported |
| Mean                             | Supported | Supported | Supported |
| Variance                         | Supported | Planned   | Planned   |
| Standard deviation               | Supported | Planned   | Planned   |
| Quantiles                        | Supported | Supported | Supported |
| Automatic bounds approximation   | Supported | Planned   | Planned   |
| Truncated geometric thresholding | Supported | Supported | Supported |
| Laplace thresholding             | Supported | Supported | Supported |
| Gaussian thresholding            | Planned   | Supported | Supported |

Implementations of the Laplace mechanism and the Gaussian mechanism use secure noise generation. These mechanisms can be used to perform computations that aren't covered by the algorithms implemented in our libraries.

The DP building block libraries are suitable for research, experimental, or production use cases, while the other tools are currently experimental and subject to change.

How to Build

To build the differential privacy library, you need Bazel version 3.7.2. If you don't have it already, follow the instructions for your platform on the Bazel website.

You also need to install Git, if you don't have it already. Follow the instructions for your platform on the Git website.

Once you've installed Bazel and Git, open a Terminal and clone the differential privacy directory into a local folder:

git clone https://github.com/google/differential-privacy.git

Navigate into the differential-privacy folder you just created, and build the differential privacy library and dependencies using Bazel (note: ... is a part of the command and not a placeholder):

To build the C++ library, run:

cd cc
bazel build ...

To build the Go library, run:

cd go
bazel build ...

To build the Java library, run:

cd java
bazel build ...

You may need to install additional dependencies when building the PostgreSQL extension, for example on Ubuntu you will need these packages:

sudo apt-get install make libreadline-dev bison flex

Caveats of the DP building block libraries

Differential privacy requires a bound on the maximum number of contributions each user can make to a single aggregation. The DP building block libraries don't perform such bounding: their implementation assumes that each user contributes only a fixed number of rows to each partition. That number can be configured by the caller. The libraries neither verify nor enforce this limit; it is the caller's responsibility to pre-process the data to enforce it.

We chose not to implement this step at the DP building block level because it requires some global operation over the data: group by user, and aggregate or subsample the contributions of each user before passing them on to the DP building block aggregators. Given scalability constraints, this pre-processing must be done by a higher-level part of the infrastructure, typically a distributed processing framework: for example, Privacy on Beam relies on Apache Beam for this operation.
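The pre-processing described above can be sketched in plain Go. This is an illustrative example under assumed names (the `row` type and `boundContributions` helper are hypothetical, not part of any library); real pipelines do the same grouping and subsampling at scale with a framework like Apache Beam:

```go
package main

import (
	"fmt"
	"math/rand"
)

// row is one (user, partition, value) record. Hypothetical type for
// illustration; not part of the DP libraries.
type row struct {
	User, Partition string
	Value           float64
}

// boundContributions keeps at most maxPerPartition rows for each
// (user, partition) pair, chosen uniformly at random. This is the
// contribution-bounding pre-processing the DP building blocks expect
// the caller to have already done.
func boundContributions(rows []row, maxPerPartition int) []row {
	byKey := make(map[[2]string][]row)
	for _, r := range rows {
		k := [2]string{r.User, r.Partition}
		byKey[k] = append(byKey[k], r)
	}
	var out []row
	for _, group := range byKey {
		// Subsample by shuffling and truncating.
		rand.Shuffle(len(group), func(i, j int) { group[i], group[j] = group[j], group[i] })
		if len(group) > maxPerPartition {
			group = group[:maxPerPartition]
		}
		out = append(out, group...)
	}
	return out
}

func main() {
	rows := []row{
		{"alice", "p1", 1}, {"alice", "p1", 2}, {"alice", "p1", 3},
		{"bob", "p1", 4},
	}
	// alice is capped at 2 rows in p1; bob keeps his single row.
	bounded := boundContributions(rows, 2)
	fmt.Printf("%d rows after bounding\n", len(bounded))
}
```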

For more detail about our approach to building scalable end-to-end differential privacy frameworks, we recommend reading our paper about differentially private SQL, which describes such a system. Even though the interface of Privacy on Beam is different, it conceptually uses the same framework as the one described in this paper.

Support

We will continue to publish updates and improvements to the library. We are happy to accept contributions to this project. Please follow our guidelines when sending pull requests. We will respond to issues filed in this project. If we intend to stop publishing improvements and responding to issues we will publish notice here at least 3 months in advance.

License

Apache License 2.0

Support Disclaimer

This is not an officially supported Google product.

Reach out

We are always keen on learning about how you use this library and what use cases it helps you to solve. We have two communication channels:

Please refrain from sending any personal identifiable information. If you wish to delete a message you've previously sent, please contact us.

Related projects

  • PyDP, a Python wrapper of our C++ DP building block library.
  • OpenDP, tools for statistical analysis of sensitive private data.
  • TensorFlow Privacy, a library to train machine learning models with differential privacy.