
Library for training machine learning models with privacy for training data

140 issues

Traceback (most recent call last):
  File "D:\Anaconda2\envs\tf1\lib\site-packages\tensorflow_core\python\framework\ops.py", line 1607, in _create_c_op
    c_op = c_api.TF_FinishOperation(op_desc)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Dimension size must be evenly divisible by 250 but is 1 for 'training/Reshape' (op: 'Reshape')...
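This "evenly divisible" error typically means the batch size is not a multiple of `num_microbatches`: the DP optimizer reshapes the flat per-example loss vector into `[num_microbatches, batch_size // num_microbatches]` before clipping. A minimal sketch of the constraint (pure Python; the helper name is hypothetical):

```python
def check_microbatch_shape(batch_size, num_microbatches):
    # DP-SGD splits the per-example loss vector of length batch_size
    # into num_microbatches equal groups; the split only works when
    # batch_size is an exact multiple of num_microbatches.
    if batch_size % num_microbatches != 0:
        raise ValueError(
            f"batch_size {batch_size} is not divisible by "
            f"num_microbatches {num_microbatches}")
    return batch_size // num_microbatches

assert check_microbatch_shape(256, 256) == 1  # one example per microbatch
# A batch of 1 with num_microbatches=250 reproduces the error above:
# check_microbatch_shape(1, 250) raises ValueError.
```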

Hi all, in DP-SGD, given a fixed minibatch size, say 256, I was wondering why increasing the number of microbatches has a bad effect on performance....
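One intuition (a sketch under simplifying assumptions, not the library's code): with a fixed clip norm, each microbatch's average gradient is clipped separately before the final average, so more microbatches means clipping is applied to smaller, less-averaged gradients, which discards more signal:

```python
import math

def l2_clip(vec, clip_norm):
    # Scale vec down so its L2 norm is at most clip_norm.
    norm = math.sqrt(sum(x * x for x in vec))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [x * scale for x in vec]

def microbatch_gradient(per_example_grads, num_microbatches, clip_norm):
    # Split per-example gradients into microbatches, average within each,
    # clip each microbatch average, then average the clipped results.
    size = len(per_example_grads) // num_microbatches
    dim = len(per_example_grads[0])
    clipped = []
    for m in range(num_microbatches):
        group = per_example_grads[m * size:(m + 1) * size]
        avg = [sum(g[i] for g in group) / size for i in range(dim)]
        clipped.append(l2_clip(avg, clip_norm))
    return [sum(c[i] for c in clipped) / num_microbatches
            for i in range(dim)]

g = [[4.0, 0.0], [0.0, 4.0]]
# Averaging first (1 microbatch) preserves more of the signal than
# clipping each example separately (2 microbatches):
coarse = microbatch_gradient(g, 1, 1.0)  # L2 norm 1.0 after clipping
fine = microbatch_gradient(g, 2, 1.0)    # L2 norm ~0.71 after clipping
```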

Is tensorflow-privacy compatible with tensorflow-gpu 1.4?

When trying to import `tensorflow_privacy`, I get the following error. My TensorFlow version is 1.14.0 and I am running the import in a Jupyter notebook. I installed the library...

Some algorithms like DP Federated Averaging need to support weighted DP means. The "weight" could be an argument to preprocess_record, but not all DPQueries need to support weighted records. An...

How do I return a vector loss in a Keras custom loss function, so that I don't run into the following error: ValueError: Dimension size must be evenly divisible by...
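In the TF Privacy tutorials this is handled by constructing the Keras loss with `reduction=tf.keras.losses.Reduction.NONE`, so the loss has shape `[batch_size]` instead of being a scalar mean. A pure-Python sketch of the shape contract the DP optimizer expects (the helper name is hypothetical):

```python
import math

def vector_cross_entropy(probs, labels):
    # Return one loss value per example (shape [batch_size]),
    # instead of reducing to a single scalar mean; the DP optimizer
    # reshapes this vector into microbatches before clipping.
    return [-math.log(p[y]) for p, y in zip(probs, labels)]

probs = [[0.7, 0.3], [0.2, 0.8]]
labels = [0, 1]
losses = vector_cross_entropy(probs, labels)
assert len(losses) == len(labels)  # vector loss: one entry per example
```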

I'm running the vectorized tutorial (tf-gpu 1.14) and the per-epoch training time on GPU (Nvidia Tesla M10) is slower than with the non-vectorized implementation: GPU vectorized: 203.55 seconds/epoch; non-vectorized...

Is there any guidance on how to choose parameters in conjunction with the Keras ImageDataGenerator? I experience an InvalidArgumentError during the last quarter of the 1st epoch. I guess the issue is...

The example was outdated. Fixed it to use the Keras L2 regularizer.

cla: yes

I am trying to add privacy to my WGAN. With the Adam optimizer without DP, my WGAN works well. But after I change it to optimizer = DPAdamGaussianOptimizer(l2_norm_clip=3, noise_multiplier=0.5, num_microbatches=1,...
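For context, a minimal sketch of the per-step computation a DP Gaussian optimizer performs (clip each microbatch gradient, add calibrated Gaussian noise, average); this is an illustration under simplifying assumptions, not the `DPAdamGaussianOptimizer` implementation:

```python
import math
import random

def dp_gradient_step(per_microbatch_grads, l2_norm_clip, noise_multiplier,
                     seed=0):
    # Clip each microbatch gradient to l2_norm_clip, sum the clipped
    # gradients, add Gaussian noise with standard deviation
    # l2_norm_clip * noise_multiplier, then average over microbatches.
    rng = random.Random(seed)
    dim = len(per_microbatch_grads[0])
    total = [0.0] * dim
    for g in per_microbatch_grads:
        norm = math.sqrt(sum(x * x for x in g))
        scale = min(1.0, l2_norm_clip / norm) if norm > 0 else 1.0
        for i in range(dim):
            total[i] += g[i] * scale
    sigma = l2_norm_clip * noise_multiplier
    n = len(per_microbatch_grads)
    return [(total[i] + rng.gauss(0.0, sigma)) / n for i in range(dim)]
```

With `noise_multiplier=0.5` and `num_microbatches=1` as in the snippet above, the noise standard deviation is half the clip norm on every step, which can easily swamp a WGAN critic's gradient signal; that is one plausible reason training degrades after the switch.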