differential-privacy
Rust implementation of DP
It would be great to have a Rust implementation of DP, ideally written natively in Rust or, failing that, an FFI wrapper around the C++ implementation. We could use something like that for https://github.com/project-oak/oak .
cc @conradgrobler @rbehjati
Agreed, Rust would be great! I'm happy to discuss more, e.g., what functionality of the DP lib you need. As far as I understand, Oak might also use a local DP model, rather than the central DP model that is used in this repo.
At the moment we are using a central model based on the idea that we have a trusted runtime running inside a trusted execution environment that does the aggregation.
Having support for count, sum, and Laplace noise would be our most urgent need.
We have an initial implementation of differentially private metrics collection which supports event counting and summing of integers in defined ranges. It supports only Laplace noise. The noise is generated naively by sampling from a uniform distribution and applying the inverse CDF. Based on reading https://github.com/google/differential-privacy/blob/main/common_docs/Secure_Noise_Generation.pdf, I assume the current implementation is not secure.
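For context, the naive inverse-CDF approach described above can be sketched roughly as follows. This is a hypothetical illustration (function names and structure are mine, not the code in metrics.rs), and as noted it is exactly the floating-point sampling pattern the secure-noise document warns against:

```rust
// Hypothetical sketch of inverse-CDF Laplace sampling, NOT the Oak code.
// Maps a uniform sample u in (0, 1) to a sample from Laplace(mu, b).
// This floating-point construction is known to be insecure for DP
// (see Secure_Noise_Generation.pdf); it only illustrates the naive approach.
fn laplace_inverse_cdf(u: f64, mu: f64, b: f64) -> f64 {
    assert!(u > 0.0 && u < 1.0, "u must lie strictly inside (0, 1)");
    let p = u - 0.5;
    // Inverse CDF of the Laplace distribution:
    // F^{-1}(u) = mu - b * sign(u - 0.5) * ln(1 - 2|u - 0.5|)
    mu - b * p.signum() * (1.0 - 2.0 * p.abs()).ln()
}

fn main() {
    // u = 0.5 lands exactly on the location parameter mu.
    println!("{}", laplace_inverse_cdf(0.5, 0.0, 1.0));
    // Samples away from 0.5 are symmetric around mu.
    println!("{}", laplace_inverse_cdf(0.75, 0.0, 1.0));
    println!("{}", laplace_inverse_cdf(0.25, 0.0, 1.0));
}
```

In a real sampler, `u` would come from a cryptographically secure RNG, but even then the finite-precision mapping leaks information, which is why the secure noise generation techniques in this repo are needed.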
For reference, the current implementation is at https://github.com/project-oak/oak/blob/main/oak_functions/loader/src/metrics.rs
In the future we would like to expand the metrics collection features to support more aggregation types and Gaussian noise.
Closing this as we do not have any plans for a Rust implementation as of now.