
PINN variants addition and Solvers Update

Open dario-coscia opened this issue 11 months ago • 5 comments

Newly Added Solvers

  • [x] GPINN solver class gradient enhanced pinn paper link
  • [x] CausalPINN solver class causal pinn paper link
  • [x] SAPINN solver class self adaptive pinn paper link
  • [x] CompetitivePINN solver class competitive pinn paper link
  • [x] ReducedOrderModel solver class paper link (a usage sketch for the new solvers follows this list)
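
As a rough sketch of how the new solvers are meant to be used, assuming they keep the constructor of the existing PINN solver (problem and model; please check the final API in this PR), and with problem as a placeholder for any problem already defined through pina.problem:

    from pina import Trainer
    from pina.model import FeedForward
    from pina.solvers import PINN, GPINN

    # `problem` is a placeholder for a problem defined via pina.problem,
    # not a runnable definition; the constructors below are assumed to
    # mirror the existing PINN solver, not taken from the final API.
    model = FeedForward(input_dimensions=2, output_dimensions=1)

    solver = PINN(problem=problem, model=model)    # baseline
    solver = GPINN(problem=problem, model=model)   # gradient-enhanced variant from this PR

    Trainer(solver=solver, max_epochs=1000).train()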

Modifying existing solvers

  • GAROM, SupervisedSolver, SolverInterface: added the on_train_start method to perform checks on the dataloader
  • SupervisedSolver: added a loss_data method, which defines the computation of the loss function. Users can inherit from this class to create data-driven solvers (e.g. ReducedOrderModelSolver); see the sketch after this list.
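
As a rough illustration of this extension point (the argument names of loss_data below are an assumption, not necessarily the final signature), a user-defined data-driven solver could look like this:

    import torch
    from pina.solvers import SupervisedSolver

    class RelativeL2Solver(SupervisedSolver):
        """Toy data-driven solver: only the data loss is overridden."""

        def loss_data(self, input_pts, output_pts):
            # swap the default loss for a relative L2 error
            pred = self.forward(input_pts)
            return torch.linalg.norm(pred - output_pts) / torch.linalg.norm(output_pts)

ReducedOrderModelSolver is built on the same idea, overriding loss_data with its ROM-specific loss.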

Miscellaneous

Added tests and updated the documentation. Each solver is now documented, and its mathematical formulation is explained.

dario-coscia avatar Mar 15 '24 17:03 dario-coscia

Hi @ndem0 @guglielmopadula 👋🏻

Lately, @annaivagnes, @valc89 and I have been working on fixing existing bugs in the PINA solvers and on creating new ones. Here I have tried to summarize the most relevant features of this pull request.

Let us know if you are satisfied with the changes. If there is interest, we could also add a comparative tutorial for the physics-informed solvers! Thanks 🚀😃

dario-coscia avatar Apr 30 '24 17:04 dario-coscia

Hi! Exceptional work! I have only three observations:

  1. From the SAPINN paper, it seems to me that there is a different weight for every point of every condition. In your implementation, however, it seems that there is only one weight per condition, so points belonging to the same condition share the same weight. Is this correct?
  2. I personally don't like a long sequence of commented-out lines of code as TODOs (talking about the tests now); maybe we can remove the commented lines and open a new issue/discussion instead.
  3. I can't find the ReducedOrderSolver in the modifications. I am pretty sure, however, that it was there when I checked the first time. Am I simply blind because of the late hour, or has it been removed? Thank you.

guglielmopadula avatar May 03 '24 21:05 guglielmopadula

Thanks for spotting point (3); I have added the ROM solver (I don't know what happened). Regarding point (2), we left the code commented as it was; once GPU tests are activated on GitHub we will remove the comments. For point (1), we follow the convention of having one self-adaptive weight for each residual point; we do this in on_train_start:

    def on_train_start(self):
        """
        This method is called at the start of the training for setting
        the self-adaptive weights as parameters of the mask model.

        :return: Whatever is returned by the parent
            method ``on_train_start``.
        :rtype: Any
        """
        device = torch.device(
            self.trainer._accelerator_connector._accelerator_flag
        )
        # allocate one self-adaptive weight per residual point, per condition
        for condition_name, tensor in self.problem.input_pts.items():
            self.weights_dict.torchmodel[condition_name].sa_weights.data = torch.rand(
                (tensor.shape[0], 1),
                device=device,
            )
        return super().on_train_start()

Of course, if someone changes the points during training via a callback (but why would one do that with SAPINN?), everything breaks, so maybe we can think about something smarter, @valc89? A warning in the documentation is a good start anyway.
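
Purely as an illustration of a possible guard, and not something included in this PR, a small Lightning callback could fail fast if the number of residual points stops matching the weights allocated in on_train_start (attribute names are taken from the snippet above; the import path depends on the installed Lightning package):

    from lightning.pytorch.callbacks import Callback  # or: from pytorch_lightning.callbacks import Callback

    class SAPINNWeightShapeCheck(Callback):
        """Raise if the residual points no longer match the self-adaptive
        weights allocated in on_train_start."""

        def on_train_epoch_start(self, trainer, pl_module):
            for name, pts in pl_module.problem.input_pts.items():
                weights = pl_module.weights_dict.torchmodel[name].sa_weights
                if weights.shape[0] != pts.shape[0]:
                    raise RuntimeError(
                        f"Condition '{name}' has {pts.shape[0]} points but "
                        f"{weights.shape[0]} self-adaptive weights."
                    )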

dario-coscia avatar May 04 '24 09:05 dario-coscia

I'm trying to fix the local imports and we're done.

ndem0 avatar May 06 '24 15:05 ndem0

@ndem0 Can we merge?

dario-coscia avatar May 10 '24 10:05 dario-coscia