nbro
@mcourteaux Thank you for this info (I was already suspecting this, btw). Have a look at the duplicate issue https://github.com/tensorflow/tensorflow/issues/36181. You should also provide this info there. Feel free to...
@brianwa84, @jvdillon Can you please confirm (or not) that the solution provided by @mcourteaux is the most appropriate workaround that currently exists? I've trained a Bayesian neural network by early stopping...
@joaocaldeira Well, as I say in the comment above, I was trying to use the NLL computed with the workaround above to early-stop my model, but the model early...
@joaocaldeira I think I did exactly the same thing. What's the size of your dataset and what problem are you trying to solve? I just want to understand if this...
@Strateus I used the solution described in the comment https://github.com/tensorflow/probability/issues/742#issuecomment-580433644 and it worked for me with TF 2. You need to import that function with `from tensorflow.python.keras.utils.tf_utils import is_tensor_or_variable` and...
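For anyone landing here, the shape of that workaround is roughly the following. This is a minimal pure-Python sketch of the wrapper pattern, not the exact class from the linked comment: the real `MetricWrapper` subclasses `tf.keras.losses.Loss` and uses `is_tensor_or_variable` inside `get_config` to evaluate tensor-valued kwargs before serializing them.

```python
class MetricWrapper:
    """Sketch of the wrapper pattern (hypothetical simplification;
    the real class subclasses tf.keras.losses.Loss)."""

    def __init__(self, fn, name=None, **kwargs):
        self.fn = fn                      # the wrapped loss, e.g. negloglik
        self.name = name or fn.__name__
        self._fn_kwargs = kwargs          # extra arguments forwarded to fn

    def __call__(self, y_true, y_pred):
        # Delegate to the wrapped function, exactly like a plain loss.
        return self.fn(y_true, y_pred, **self._fn_kwargs)

    def get_config(self):
        # The real workaround guards each kwarg with is_tensor_or_variable
        # and reads tensor values via tf.keras.backend.eval before storing.
        return {"name": self.name, **self._fn_kwargs}


def negloglik(y_true, rv_y):
    # With TFP, rv_y is a distribution output by the model's last layer;
    # the loss is the negative log-likelihood of the observed y_true.
    return -rv_y.log_prob(y_true)
```

With this in place, the wrapped loss can be passed to `model.compile(loss=MetricWrapper(negloglik, name='nll'))` just like the bare function.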
> loss={'loss_1': negloglik, 'loss_2': MetricWrapper(negloglik, name='nll')}

To be honest, I didn't fully read your traceback, but this line seems to suggest that you're not using `MetricWrapper` for the first loss....
@Strateus But you're using `negloglik` in both cases (i.e. the same loss): you're passing `negloglik` to `MetricWrapper` for one output and using `negloglik` directly for the other.
@Strateus But I am suggesting that you use `MetricWrapper` for `loss_1` too, to avoid the error you describe above; otherwise, why would you need `MetricWrapper` in the first place...
@Strateus My question is: why don't you use `MetricWrapper` for `loss_1` too? That's what I haven't understood yet.
@Strateus That's why I asked another question above: why do you use it for `loss_2` if the loss works when passed directly? I think there's a big misunderstanding here.