cleverhans
Bug in TF2 PGD implementation
Describe the bug
In `future.tf2.attacks.projected_gradient_descent`, the sanity check `clip_min <= x <= clip_max` is implemented incorrectly: a tensor is appended to `asserts`, and the subsequent call of `np.all` on `asserts` invokes the `__bool__` cast of that tensor, which raises the "The truth value of an array with more than one element is ambiguous" error.
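The failure mode can be reproduced without CleverHans. A minimal sketch, using NumPy arrays as stand-ins for the TF check tensors appended to `asserts` (the shapes below are illustrative, not taken from the library):

```python
import numpy as np

# Stand-ins for the checks appended to `asserts`: one scalar check
# and one elementwise check with the shape of the input x.
asserts = [np.bool_(True),
           np.array([[True, True, True],
                     [True, True, True]])]

# Truth-testing a multi-element array invokes __bool__, which is
# exactly the ambiguity error reported above.
try:
    bool(asserts[1])
except ValueError as err:
    print(type(err).__name__)  # prints "ValueError"

# One possible fix: reduce each check to a scalar before combining,
# so no multi-element array is ever truth-tested.
ok = all(np.all(a) for a in asserts)
print(ok)  # prints "True"
```

Reducing each tensor individually (e.g. with `np.all` or `tf.reduce_all`) avoids the ambiguous `__bool__` cast entirely.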
To Reproduce
Use the following code:
```python
import tensorflow as tf
import numpy as np
from cleverhans.future.tf2.attacks import projected_gradient_descent

def model_fn(x):
    return tf.nn.softmax(x, axis=-1)

X = [[1., 2., 3.], [4., 5., 6.]]
print(projected_gradient_descent(
    model_fn, X, 1., 1., 5, np.inf, 0., 7.))
```
Expected behavior
The code shouldn't crash.
System configuration
- Ubuntu 18.04.3
- Python 3.6.0
- TensorFlow 2.0.0
I have implemented a TF2 version of the PGD attack in my fork; the issue should be fixed there:
https://github.com/CNOCycle/cleverhans/tree/feature/tf2.x