Opacus uses unsafe floating-point noise generation, even with secure_mode=True
🐛 Bug
The `_generate_noise` primitive has a `secure_mode` parameter. When set to `True`, the documentation claims that the noise distribution is secure against floating-point attacks. This is wrong for two reasons.
- The approach taken, which relies on this paper, only defends against one specific attack. It still generates a distribution that has "holes" in the output space, creating distinguishing events that break DP. It merely makes the exact positions of these "holes" harder to guess, and there is no precise quantification of how hard they would be to find under standard cryptographic assumptions (see the first sketch after this list).
- Floating-point attacks rely not only on flaws in the noise generation primitive, but also on the addition step. In particular, precision-based attacks succeed even when the noise generation primitive is perfect (see the second sketch after this list). For example, diffprivlib, which relies on the same paper, is still vulnerable to such attacks.
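To make the first point concrete, here is a minimal numpy sketch (not opacus code) of where the "holes" come from. It assumes a common inverse-CDF-style sampler, where the noise magnitude is obtained as a logarithm of a uniform double drawn from the 2^-53 grid; opacus's sampler differs in its details, but the same phenomenon affects any sampler that pushes a discrete grid through a smooth transform:

```python
import numpy as np

# Uniform doubles from a typical PRNG lie on the grid {k * 2**-53}.
grid = 2.0 ** -53
u = 0.5  # a representative grid point

# Inverse-CDF transform for an Exponential/Laplace magnitude: y = -log(u).
y0 = -np.log(u)         # image of u
y1 = -np.log(u + grid)  # image of the next grid point

print(y0 - y1)          # gap between consecutive outputs: ~2**-52
print(np.spacing(y1))   # float spacing at that magnitude:  ~2**-53

# The sampler's image skips every other representable float around 0.69;
# the skipped floats are "holes" that can never be output. Averaging a few
# such samples moves the holes around but does not fill them in.
```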
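To make the second point concrete, here is a similar sketch of a precision-based attack, again in plain numpy. The true answer is either 0 or 1, and the noise primitive is assumed to be perfect; the floating-point *addition* alone creates a distinguishing event, because by the Sterbenz lemma `1.0 + z` is computed exactly when `z` is close to -1, so those outputs inherit the coarse 2^-53 grid on which `z` lies near -1:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.laplace(scale=1.0, size=500_000)  # pretend this primitive is perfect

GRID = 2.0 ** -53  # float64 spacing in [0.5, 1): the grid z lies on near -1

def off_grid_count(out):
    # Distinguishing event: an output in (0, 0.5) that is NOT a multiple
    # of 2**-53. It can happen when the true answer is 0, never when it is 1.
    sel = (out > 0) & (out < 0.5)
    return np.count_nonzero(out[sel] % GRID != 0)

print(off_grid_count(0.0 + z))  # large: fine-grained noise values survive
print(off_grid_count(1.0 + z))  # 0: the addition is exact here, so every
                                # output is a multiple of 2**-53
```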
Solutions to this vulnerability include using an approach based either on discretization (as GoogleDP does) or on interval refining (as Tumult Analytics does); a minimal sketch of the discretization idea appears below.
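For illustration, here is a minimal sketch of the discretization idea; the function name is hypothetical and this is not GoogleDP's actual implementation. Both the true answer and the noise are constrained to a fixed grid, so the granularity of the output reveals nothing about the input. A real implementation would additionally handle the tails/overflow and account for the rounding step in the privacy analysis:

```python
import numpy as np

def discrete_laplace(scale, granularity, size, rng):
    """Noise supported on multiples of `granularity`, with
    P(k * granularity) proportional to exp(-|k| * granularity / scale)."""
    p = np.exp(-granularity / scale)
    # The difference of two iid geometric variables follows a two-sided
    # geometric (i.e. discrete Laplace) distribution on the integers.
    g1 = rng.geometric(1 - p, size=size) - 1  # support {0, 1, 2, ...}
    g2 = rng.geometric(1 - p, size=size) - 1
    return (g1 - g2) * granularity

rng = np.random.default_rng(0)
g = 2.0 ** -20  # grid granularity: a power of two, so arithmetic stays exact

true_value = 42.0
# Snap the true answer onto the grid before adding grid-valued noise: every
# output is an exact multiple of g regardless of the input, which removes
# the precision-based distinguishing events shown above.
out = np.round(true_value / g) * g + discrete_laplace(1.0, g, size=5, rng=rng)
print(out)
```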
Additional context
As an aside, I do not understand why it makes sense to even have a `secure_mode` parameter, especially if the default is `False`. In which context does it make sense to use a DP library but not want the output to actually be DP?