
WeightNormalization not working with TensorBoard histograms

Open TWJubb opened this issue 4 years ago • 1 comment

System information

  • TensorFlow version 2.1.0
  • TensorFlow-Addons version 0.9.1
  • Python version 3.6
  • Is GPU used? yes

Describe the bug

I am using the WeightNormalization wrapper in a Keras model and want to monitor the weights and gradients with TensorBoard, but I believe this boolean variable

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/layers/wrappers.py#L104

which tracks whether the layer has been initialised, is causing the TensorBoard histogram generation to fail. I changed it to a tf.int32 weight with shape (1,), which solved the issue, but this seems a bit hacky.
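For context, the variable in question is a non-trainable boolean weight created in the wrapper's build(). Below is a rough paraphrase of that line and of the hacky change I made; this is a sketch, not the exact addons source, and the attribute name is only illustrative.

# Rough paraphrase of the flag created inside WeightNormalization.build()
# (tensorflow_addons/layers/wrappers.py, where tf is already imported).
self._initialized = self.add_weight(
    name="initialized",
    shape=None,
    dtype=tf.dtypes.bool,
    initializer="zeros",
    trainable=False,
)

# Hacky change that made the TensorBoard histograms work for me: store the
# flag as an int32 weight of shape (1,) instead of a scalar bool.
self._initialized = self.add_weight(
    name="initialized",
    shape=(1,),
    dtype=tf.dtypes.int32,
    initializer="zeros",
    trainable=False,
)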

Code to reproduce the issue

I am using the model.fit_generator() function with

callbacks = [tf.keras.callbacks.TensorBoard(write_grads=True, histogram_freq=1, log_dir="...")]

on a model that includes WeightNormalization; the model trains fine without the TensorBoard callback (a minimal sketch is below).

(I don't think write_grads is doing anything here, but that's a separate issue.)
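Here is a minimal, self-contained sketch of the kind of setup that fails for me. Layer sizes, the toy data, and the log directory are placeholders rather than my actual model, and I use model.fit for brevity (my real code uses fit_generator).

import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

# Toy model with a Dense layer wrapped in WeightNormalization.
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(8,)),
    tfa.layers.WeightNormalization(tf.keras.layers.Dense(4, activation="relu")),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(64, 8).astype("float32")
y = np.random.rand(64, 1).astype("float32")

# Training succeeds without the callback; adding histogram_freq=1 raises the
# InvalidArgumentError shown below.
callbacks = [tf.keras.callbacks.TensorBoard(log_dir="logs", histogram_freq=1)]
model.fit(x, y, epochs=2, callbacks=callbacks)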

Here is the actual error message:

InvalidArgumentError: Value for attr 'T' of bool is not in the list of allowed values: float, double, int32, uint8, int16, int8, int64, bfloat16, uint16, half, uint32, uint64
	; NodeDef: {{node WriteHistogramSummary}}; Op<name=WriteHistogramSummary; signature=writer:resource, step:int64, tag:string, values:T -> ; attr=T:type,default=DT_FLOAT,allowed=[DT_FLOAT, DT_DOUBLE, DT_INT32, DT_UINT8, DT_INT16, DT_INT8, DT_INT64, DT_BFLOAT16, DT_UINT16, DT_HALF, DT_UINT32, DT_UINT64]; is_stateful=true> [Op:WriteHistogramSummary] name: causal_conv2d_3/weight_normalization_42/initialized_0/
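The tensor named in the error is the wrapper's initialized flag. A quick way to see it (a sketch, with arbitrary layer sizes) is to list the wrapper's weights and their dtypes:

import tensorflow as tf
import tensorflow_addons as tfa

# Build a WeightNormalization-wrapped Dense layer and list its weights.
layer = tfa.layers.WeightNormalization(tf.keras.layers.Dense(4))
_ = layer(tf.random.normal((16, 8)))  # calling the layer builds it
for w in layer.weights:
    print(w.name, w.dtype.name, "trainable=%s" % w.trainable)
# I expect an entry like ".../initialized:0 bool trainable=False" alongside the
# g scale and the wrapped Dense kernel/bias; that boolean weight is what the
# histogram summary op rejects.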

TWJubb · May 08 '20 12:05