deepxde
Tensorflow 1.x backend: add dropout to DeepONet
DeepONet now supports the dropout technique.
Do you use dropout to prevent overfitting?
Yes, I'm using dropout_rate right now during hyperparameter tuning. I'll write how useful it is in my case.
I am curious how useful it is. In general, I found dropout is not that useful, and L1/L2 regularization seems good enough.
Hyperparameter tuning showed that in my case neither dropout nor regularization is required. However, I will use dropout anyway for uncertainty quantification (UQ).
Yes, dropout is useful for UQ. How do you implement DeepONet UQ?
I just run model.predict many times to get the final prediction with a confidence interval:
import numpy as np

def predict_with_uncertainty(model, x, trial_num=100):
    # Run repeated stochastic forward passes (dropout active at
    # inference) and aggregate them into a mean and a standard deviation.
    predictions = []
    for _ in range(trial_num):
        predictions.append(model.predict(x))
    return np.mean(predictions, axis=0), np.std(predictions, axis=0)
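To illustrate the idea, here is a self-contained sketch of the same Monte Carlo pattern. The NoisyModel class is a hypothetical stand-in for a trained network whose predict() is stochastic (as it is when dropout stays on at inference); it is not part of DeepXDE.

```python
import numpy as np

class NoisyModel:
    # Hypothetical stand-in: a "model" whose predictions vary from
    # call to call, mimicking dropout at inference time.
    def predict(self, x):
        return np.sin(x) + np.random.normal(scale=0.05, size=x.shape)

def predict_with_uncertainty(model, x, trial_num=100):
    # Repeated stochastic forward passes, aggregated into a mean
    # prediction and a per-point standard deviation.
    predictions = [model.predict(x) for _ in range(trial_num)]
    return np.mean(predictions, axis=0), np.std(predictions, axis=0)

x = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
mean, std = predict_with_uncertainty(NoisyModel(), x, trial_num=200)
# mean approximates sin(x); std approximates the injected noise scale
```

With a real DeepONet, the standard deviation reflects the spread induced by dropout rather than injected noise, but the aggregation step is identical.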
In fact, we have the DropoutUncertainty callback for this: https://deepxde.readthedocs.io/en/latest/modules/deepxde.html#deepxde.callbacks.DropoutUncertainty. Does this work for your case?
Yes, I used DropoutUncertainty. The above code snippet is more convenient in my case, since I use it with already trained models. In addition, it allows you to set trial_num, which is a constant in DropoutUncertainty.