
Custom optimizer in deepxde

Open uttamsuman opened this issue 1 year ago • 1 comment

Is it possible to write a custom optimizer in DeepXDE? @lululxvi If yes, how?

uttamsuman avatar Jan 17 '24 05:01 uttamsuman

Yes, sure. First clone the DeepXDE repository and create or copy a Python file for your problem in the root directory of the repo.

Open the optimizers.py file in your copy of the repo. Here I assume we are writing a custom optimiser in PyTorch. After line 32 you can see a number of optimisers being instantiated, depending on the optimiser you selected in the model.compile() call of your code.

Simply add your optimiser class at the top of the optimizers.py file. It should look like this:

import torch


class CustomOptimiser(torch.optim.Optimizer):
    def __init__(self, params, lr=1e-3, momentum=0.9, weight_decay=0):
        # Store hyperparameters in defaults, as torch.optim.Optimizer expects.
        defaults = dict(lr=lr, momentum=momentum, weight_decay=weight_decay)
        super().__init__(params, defaults)

    def step(self, closure=None):
        # Your update rule here: loop over self.param_groups and
        # modify each parameter in place.
        ...
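For intuition, here is the arithmetic a momentum-based step() typically performs, written as a framework-free sketch (the function name and the functional, list-based style are illustrative choices of mine, not DeepXDE's or PyTorch's API):

```python
def momentum_sgd_step(params, grads, velocities, lr=0.1, momentum=0.9):
    """One SGD-with-momentum update: v <- momentum * v + g; p <- p - lr * v."""
    new_params, new_velocities = [], []
    for p, g, v in zip(params, grads, velocities):
        v = momentum * v + g          # accumulate the velocity
        new_params.append(p - lr * v)  # move the parameter against the gradient
        new_velocities.append(v)
    return new_params, new_velocities
```

In a real torch.optim.Optimizer subclass, the same update would be applied in place to the tensors in self.param_groups, with the velocity kept in self.state.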

Then modify DeepXDE's optimizers.py as follows:

elif optimizer == "custom":
    optim = CustomOptimiser(
        params, lr=learning_rate, weight_decay=weight_decay
    )

You can refer to PyTorch's documentation on how to write a custom optimiser. To be honest, I have never needed one.

You can then use your optimiser in your Python file as follows:

model.compile("custom", lr=0.001)
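The plumbing that makes this work is plain string dispatch: the optimizer name passed to model.compile() selects a constructor inside optimizers.py. A standalone sketch of that pattern, using a registry dict instead of the elif chain (all names here are hypothetical, not DeepXDE's actual internals):

```python
def make_sgd(params, lr):
    # Stand-in for torch.optim.SGD(params, lr=lr).
    return {"kind": "sgd", "params": params, "lr": lr}

def make_custom(params, lr):
    # Stand-in for CustomOptimiser(params, lr=lr).
    return {"kind": "custom", "params": params, "lr": lr}

# Adding a new optimiser is then a one-line registration.
OPTIMIZERS = {"sgd": make_sgd, "custom": make_custom}

def create_optimizer(name, params, learning_rate):
    """Map an optimizer-name string to a constructed optimizer."""
    try:
        return OPTIMIZERS[name](params, learning_rate)
    except KeyError:
        raise NotImplementedError(f"Optimizer '{name}' is not implemented.")
```

This is equivalent in behaviour to the elif chain you edit in optimizers.py; unknown names fail loudly instead of silently falling through.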

praksharma avatar Feb 26 '24 13:02 praksharma