
Tensorflow 1.x backend: add dropout to DeepONet

Open vl-dud opened this issue 1 year ago • 8 comments

DeepONet now supports the dropout technique.
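For context, a minimal sketch of what the dropout technique does, in plain NumPy (this illustrates standard inverted dropout, not DeepXDE's internal implementation):

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    # Inverted dropout: zero each unit with probability `rate`,
    # and rescale survivors by 1/(1 - rate) so the expected
    # activation is unchanged between training and inference.
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones((4, 3))
y = dropout(x, rate=0.5, rng=rng)  # entries are either 0.0 or 2.0
```

At inference time dropout is normally disabled (`training=False`), but keeping it active is exactly what enables the Monte Carlo uncertainty estimates discussed later in this thread.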

vl-dud avatar Dec 03 '23 18:12 vl-dud

Do you use dropout to prevent overfitting?

lululxvi avatar Dec 04 '23 15:12 lululxvi

> Do you use dropout to prevent overfitting?

Yes, I'm using dropout_rate right now during hyperparameter tuning. I'll report back on how useful it is in my case.

vl-dud avatar Dec 04 '23 16:12 vl-dud

I am curious how useful it is. In general, I found dropout is not that useful, and L1/L2 regularization seems good enough.

lululxvi avatar Dec 04 '23 16:12 lululxvi

> I am curious how useful it is. In general, I found dropout is not that useful, and L1/L2 regularization seems good enough.

Hyperparameter tuning showed that in my case neither dropout nor regularization is required. However, I will use dropout anyway for UQ.

vl-dud avatar Apr 08 '24 11:04 vl-dud

Yes, dropout is useful for UQ. How do you implement DeepONet UQ?

lululxvi avatar Apr 08 '24 14:04 lululxvi

I just run model.predict many times and take the mean and standard deviation across trials to get the final prediction with a CI:

import numpy as np

def predict_with_uncertainty(model, x, trial_num=100):
    # With dropout active at inference, each predict() call samples a
    # different sub-network; aggregating the trials gives a Monte Carlo
    # dropout estimate of the mean prediction and its uncertainty.
    predictions = [model.predict(x) for _ in range(trial_num)]
    return np.mean(predictions, axis=0), np.std(predictions, axis=0)
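To make the snippet above concrete, here is a self-contained usage sketch. `NoisyModel` is a hypothetical stand-in for a trained model whose dropout stays active at inference, so repeated `predict()` calls differ; it is not part of DeepXDE:

```python
import numpy as np

class NoisyModel:
    # Toy stand-in: predictions fluctuate around sin(x) with
    # std ~0.1, mimicking a dropout-enabled network at inference.
    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def predict(self, x):
        return np.sin(x) + 0.1 * self.rng.standard_normal(x.shape)

def predict_with_uncertainty(model, x, trial_num=100):
    predictions = [model.predict(x) for _ in range(trial_num)]
    return np.mean(predictions, axis=0), np.std(predictions, axis=0)

x = np.linspace(0, np.pi, 5)
mean, std = predict_with_uncertainty(NoisyModel(), x, trial_num=200)
lower, upper = mean - 1.96 * std, mean + 1.96 * std  # ~95% CI
```

The 1.96-sigma band assumes the per-point prediction distribution is roughly Gaussian, which is the usual simplification in MC dropout.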

vl-dud avatar Apr 08 '24 15:04 vl-dud

In fact, we have this callback https://deepxde.readthedocs.io/en/latest/modules/deepxde.html#deepxde.callbacks.DropoutUncertainty . Does this work for your case?

lululxvi avatar Apr 08 '24 20:04 lululxvi

Yes, I used DropoutUncertainty. The code snippet above is more convenient in my case, since I use it with already trained models. In addition, it lets you set trial_num, which is fixed as a constant in DropoutUncertainty.

vl-dud avatar Apr 09 '24 14:04 vl-dud