
ValueError when selecting DP optimizer minibatch size > 1 for LSTM


I'm running into a problem and hoping you all can help. I'm using a Keras LSTM model with tf-privacy, and when I set num_microbatches to a value greater than one in the optimizer definition, I get the error shown under "current behavior" below.

Describe the expected behavior: I should be able to set num_microbatches to a value > 1.

import tensorflow as tf
from tensorflow_privacy.privacy.optimizers.dp_optimizer_keras import DPKerasAdamOptimizer

# Per-example loss function; Keras reduces it to a scalar mean by default before
# it reaches the optimizer.
def loss(labels, logits):
  return tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)

optimizer = DPKerasAdamOptimizer(
    learning_rate=0.0001,
    l2_norm_clip=2.75,
    noise_multiplier=0.15,
    num_microbatches=1)

model.compile(optimizer=optimizer, loss=loss)

Describe the current behavior: Any value of num_microbatches > 1 results in this error:

ValueError: Dimension size must be evenly divisible by 16 but is 1 for '{{node Reshape}} = Reshape[T=DT_FLOAT, Tshape=DT_INT32](loss/weighted_loss/value, Reshape/shape)' with input shapes: [], [2] and with input tensors computed as partial shapes: input[1] = [16,?].
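If I'm reading the error right, the DP optimizer reshapes the loss it is given into [num_microbatches, -1], which only works if the loss is a vector of per-example values; with the default Keras reduction it receives a scalar instead. A rough illustration of the same failure (the variable names here are just mine):

import tensorflow as tf

batch_size = num_microbatches = 16
per_example_loss = tf.random.uniform([batch_size])     # shape [16]
scalar_loss = tf.reduce_mean(per_example_loss)         # shape [], i.e. a fully reduced Keras loss

tf.reshape(per_example_loss, [num_microbatches, -1])   # works: shape [16, 1]
tf.reshape(scalar_loss, [num_microbatches, -1])        # fails: 1 element cannot be split into 16 microbatches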

System information: GCP instance with a Tesla V100, Ubuntu, 8 vCPUs, TF 2.4.0, and tf-privacy 0.5.2.

Standalone code to reproduce the issue: https://gist.github.com/zredlined/72305ab04670197869e470b232d22ed4

Notes: From the examples section, it looks like we need to compute the vector of per-example losses rather than its mean over a minibatch. But I'm unable to get the loss function below (or any of the loss functions in the examples) to work. Any help would be appreciated!

# Compute vector of per-example loss rather than its mean over a minibatch.
loss = tf.keras.losses.CategoricalCrossentropy(
    from_logits=True, reduction=tf.losses.Reduction.NONE)

^^ I thought that setting the reduction to NONE would return the per-example losses that the DP optimizer is looking for, but instead I get another error: ValueError: Shapes (16, 100) and (16, 100, 65) are incompatible
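That shape mismatch makes me think the labels are integer IDs of shape (16, 100) while CategoricalCrossentropy expects one-hot targets matching the (16, 100, 65) logits. If that's right, the sparse variant with the same NONE reduction might be the loss to try; just a sketch, not verified:

# Sparse per-example (per-timestep) loss; takes integer labels instead of one-hot targets.
loss = tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=True, reduction=tf.losses.Reduction.NONE)

model.compile(optimizer=optimizer, loss=loss)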

zredlined · Jun 29 '21