
Prototype of FEM-PINNs

Open mtsokol opened this issue 4 years ago • 4 comments

Hi @pierremtb!

I'm a master's degree student and recently I've been playing with the PINNs source code and the idea described in the paper. Amazing to see it rewritten in TF2.0! For me it was a great way to learn the GradientTape API.

I thought about introducing a small tweak to the loss function as a way to merge FEM and NNs for solving differential equations. I've spent the last week on it, and I think I'm at a stage where I've completed all I could think of (the gradient is computed, the error decreases), but it still doesn't solve the problem correctly and it's really slow.

I would be thankful if you could take a look at the implementation. I don't have a lot of experience with TensorFlow, so I could have missed something, or the bare idea may be faulty. If that's the case, please let me know and I'll close the PR. (I've opened the PR for convenience.)


Idea

With the standard FEM approach we would define our basis functions and search for the coefficients that satisfy the problem defined by the differential equation and the initial and boundary conditions. So the u we search for would be approximated by a combination of coefficients and basis functions. We would do this by assembling and solving a system of equations.

In PINNs we train a neural net that behaves as a direct solution for our u, by placing the equation in the loss function.

Here the idea is a combination of both. We want to train a neural net that, for a given element (e.g. (i, j) for a 2D problem), returns its coefficient. So the solution is still described by basis functions, but its coefficients are produced by the NN. Here I decided to use B-splines as basis functions.

So the loss function won't change — it's still the standard PINN loss:

    MSE = MSE_u + MSE_f

But our approximation of u becomes a spline expansion:

    u(x, t) ≈ Σ_{i,j} c_{ij} · B_i(x) · B_j(t)

And with our NN supplying the coefficients:

    c_{ij} = NN(i, j)
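To make the ansatz concrete, here is a minimal NumPy/SciPy sketch of it, outside of TF. The `coeff_net` function is a hypothetical stand-in for the coefficient network (in the prototype it would be the TF model mapping an element index (i, j) to c_ij); the B-spline setup is my own illustrative choice, not the exact one from the PR:

```python
import numpy as np
from scipy.interpolate import BSpline

# Hypothetical stand-in for the coefficient network: in the prototype this
# would be the TF model mapping an element index (i, j) to c_ij.
def coeff_net(i, j):
    return np.sin(0.5 * i) * np.cos(0.3 * j)  # arbitrary smooth values

degree, n_basis = 3, 8
# Clamped (open uniform) knot vector on [0, 1]: len = n_basis + degree + 1.
knots = np.concatenate([np.zeros(degree),
                        np.linspace(0.0, 1.0, n_basis - degree + 1),
                        np.ones(degree)])

def basis_matrix(pts):
    """All n_basis B-spline basis functions evaluated at pts -> (len(pts), n_basis)."""
    return np.column_stack([BSpline(knots, np.eye(n_basis)[k], degree)(pts)
                            for k in range(n_basis)])

x = np.linspace(0.0, 1.0, 50)
t = np.linspace(0.0, 1.0, 40)
Bx, Bt = basis_matrix(x), basis_matrix(t)              # (50, 8), (40, 8)
C = np.array([[coeff_net(i, j) for j in range(n_basis)]
              for i in range(n_basis)])                # (8, 8) coefficients
u = Bx @ C @ Bt.T                                      # u(x, t) on the grid, (50, 40)
```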

Code

In this PR I've tried to implement it, as on paper it looked feasible. I've created a new Layer that computes the u approximation (together with comments). I've used the BSpline basis from the tensorflow_graphics library.

For a better and faster explanation I've prepared a short notebook showing how that B-spline implementation works, together with plots (it's the same implementation, so I used it for testing): https://github.com/mtsokol/PINNs-TF2.0/blob/FEM-PINNs-TF2.0/1d-burgers/BSplines.ipynb

The rest is all the same. It runs with `python3 1d-burgers/inf_cont_burgers.py`.

Problems

I've made it in about 40 lines of code and spent a fair amount of time on it, but unfortunately I didn't get it to work as expected. I've got two major issues:

  1. The error is decreasing, but unfortunately all coefficients are basically pulled down close to zero (I thought about scaling the output, but learning still just pulls them down), so the sine initial condition is not recreated (even though it is representable with this spline basis). What could I do to check whether it's simply impossible with such a loss function, or an implementation error, or maybe a wrong learning rate, weight initialization, output scaling, etc.?
  2. It's really slow. For 1000 collocation points Adam is slow and L-BFGS takes literally forever. In the PR I'm using 100 collocation points: Adam → 0.5 s per 10 iterations, L-BFGS → 35 s (!), and the final sampling is also scaled down.
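One way to separate "the basis can't represent it" from "the training/implementation is wrong" for issue 1 (a suggestion of mine, not something from the PR) is to bypass the NN entirely and fit the initial condition with a direct least-squares solve over the same kind of spline basis. If the residual is small, the basis is fine and the problem lies in the loss/optimization. A sketch, assuming the 1D Burgers setup (domain [-1, 1], initial condition -sin(πx)) and an illustrative knot vector of my own choosing:

```python
import numpy as np
from scipy.interpolate import BSpline

degree, n_basis = 3, 12
# Clamped knot vector on [-1, 1], Burgers' spatial domain.
knots = np.concatenate([np.full(degree, -1.0),
                        np.linspace(-1.0, 1.0, n_basis - degree + 1),
                        np.full(degree, 1.0)])

x = np.linspace(-1.0, 1.0, 200)
# Design matrix: column k holds basis function B_k evaluated at all x.
B = np.column_stack([BSpline(knots, np.eye(n_basis)[k], degree)(x)
                     for k in range(n_basis)])     # (200, 12)

target = -np.sin(np.pi * x)                        # Burgers' initial condition
c, *_ = np.linalg.lstsq(B, target, rcond=None)     # best coefficients, no NN
residual = np.abs(B @ c - target).max()            # small => basis is expressive enough
```

If this `residual` is tiny but training still collapses the coefficients to zero, the issue is in the loss balance or optimization rather than the representation.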

Here I think it's due to the way the gradient is computed. In the standard version we just pass the input through the layers; here, for each input we first compute the coefficients from the NN (easy) and then, for each element, we evaluate the spline, so there are a few extra multiplications, and the `cond` also makes this gradient harder to compute.
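One generic way to attack this (my suggestion, not something in the PR): since the basis values at the fixed collocation points don't depend on any trainable weights, they could be precomputed once (e.g. stored as a `tf.constant` matrix) so that the tape only has to differentiate through the coefficient network and a single batched contraction, with no per-point `map_fn`/`cond` in the graph. A NumPy sketch of the equivalence, with random stand-ins for the precomputed basis values and the NN output:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pts, n_basis = 100, 8
Bx = rng.random((n_pts, n_basis))    # stand-in: precomputed B_i(x_k) values
Bt = rng.random((n_pts, n_basis))    # stand-in: precomputed B_j(t_k) values
C = rng.random((n_basis, n_basis))   # coefficients produced by the NN

# Per-collocation-point evaluation (roughly what map_fn does):
u_loop = np.array([Bx[k] @ C @ Bt[k] for k in range(n_pts)])

# The same values as one batched contraction, no per-point ops:
u_vec = np.einsum('ki,ij,kj->k', Bx, C, Bt)
```

In TF the `einsum` form should give a much cheaper gradient, since only `C` is on the differentiation path.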

I've tried to introduce `vectorized_map`, or `iter=10` in `map_fn`, but both fail with different errors that I couldn't solve.

How can I confirm whether it's just the inherent complexity of this loss function, or an implementation issue?


Sorry once again for such long post.

What do you think about this idea? Does it make sense? Or maybe it's an inefficient approach in terms of complexity? How can I work on the problems described? If you have any advice regarding the implementation I would be thankful, as I'm still learning TensorFlow.

mtsokol avatar Jul 26 '20 21:07 mtsokol

Hi Mateusz, thanks for reaching out! Unfortunately, I'm incredibly busy ATM, so is it fine if I take a deeper look at this later this week?

pierremtb avatar Jul 26 '20 21:07 pierremtb

@pierremtb Sure! Thank you once again for any help.

(p.s. in case you hit the known problem with installing the latest tensorflow_graphics dependency on macOS, here's a solution: https://github.com/AcademySoftwareFoundation/openexr/issues/567#issuecomment-590144773)

mtsokol avatar Jul 26 '20 21:07 mtsokol

@pierremtb Hi! Sorry for interrupting once again. Did you have time to look at it?

mtsokol avatar Aug 04 '20 09:08 mtsokol

Hey @mtsokol, I'm very sorry, it must have fallen through the cracks. Will try to dive in today or tomorrow!

pierremtb avatar Aug 04 '20 09:08 pierremtb