
The loss value is very low, but the predicted value is incorrect

Tomandjob opened this issue on Mar 23 '22 · 7 comments

Dear Lu Lu, I solved a simple partial differential equation using DeepXDE, but the predicted result differs from the true result by several orders of magnitude. I modified the output of the network, but it did not work.

```python
import math

import deepxde as dde
import numpy as np


def pde(x, y):
    dy_xx = dde.grad.hessian(y, x)
    A = x * y  # the "*" operators were lost in the original paste; see the comments below
    dA_x = dde.grad.jacobian(A, x)
    return 0.5 * dy_xx + dA_x


def func(x):
    return np.exp(-x * x) / math.sqrt(np.pi)


def boundary_l(x, on_boundary):
    return on_boundary and np.isclose(x[0], -2.2)


def boundary_r(x, on_boundary):
    return on_boundary and np.isclose(x[0], 2.2)


geom = dde.geometry.Interval(-2.2, 2.2)
bc1 = dde.icbc.DirichletBC(geom, lambda x: 0, boundary_l)
bc2 = dde.icbc.DirichletBC(geom, lambda x: 0, boundary_r)

data = dde.data.PDE(geom, pde, [bc1, bc2], 8000, 2, solution=func, num_test=1000)
```
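The model construction and training steps were not included in the post; a minimal sketch to complete the script, assuming a standard fully connected tanh network trained with Adam as in the DeepXDE examples (the layer sizes, learning rate, and iteration count below are assumptions, not values from the original post):

```python
# Assumed completion: typical DeepXDE-example settings, not the poster's.
net = dde.nn.FNN([1] + [50] * 3 + [1], "tanh", "Glorot uniform")
model = dde.Model(data, net)
model.compile("adam", lr=1e-3, metrics=["l2 relative error"])
# Newer DeepXDE versions use `iterations=`; older ones use `epochs=`.
losshistory, train_state = model.train(iterations=5000)
dde.saveplot(losshistory, train_state, issave=False, isplot=True)
```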

True value: [plot of the exact solution]

Predicted value: [plot of the network prediction]

Tomandjob commented on Mar 23 '22

See FAQ "Q: I failed to train the network or get the right solution, e.g., large training loss, unbalanced losses." and "Q: Implement new losses or constraints."

lululxvi commented on Mar 25 '22

> See FAQ "Q: I failed to train the network or get the right solution, e.g., large training loss, unbalanced losses." and "Q: Implement new losses or constraints."

I reviewed the code. I didn't put the normalization condition (∫u dx = 1) in the code; instead, I put “0.05*u(x)-1” into the PDE. The predicted shape is the same as the actual shape, but the values differ by several times. I changed the output, but it didn't work. I referred to the Volterra_IDE example, but I still don't know how to write the integral into the PDE.

Tomandjob commented on Mar 26 '22

@Tomandjob First of all, please do not paste code with errors. As pasted, `A=xy` was not a valid statement in Python, and `return 0.5dy_xx + dA_x` was not valid either (it looks like the `*` operators were lost in the formatting).

Also, when you paste code, use the code block formatting tool. It helps in preserving the indentation.

I looked into it. This is what I get when I run the same code: the predictions are zero [plot after 5000 epochs], which is correct. It satisfies the BCs as well as the PDE at each point.

Here are the loss statistics:

```
Training model...

Step      Train loss                        Test loss                         Test metric
0         [1.58e-02, 2.15e-02, 2.15e-02]    [1.58e-02, 2.15e-02, 2.15e-02]    []
1000      [9.75e-08, 2.80e-09, 1.14e-10]    [9.75e-08, 2.80e-09, 1.14e-10]    []
2000      [6.61e-08, 3.77e-09, 1.71e-08]    [6.61e-08, 3.77e-09, 1.71e-08]    []
3000      [1.98e-07, 4.25e-07, 1.19e-08]    [1.98e-07, 4.25e-07, 1.19e-08]    []
4000      [5.41e-06, 6.38e-06, 7.66e-06]    [5.41e-06, 6.38e-06, 7.66e-06]    []
5000      [3.61e-07, 3.65e-07, 3.83e-07]    [3.61e-07, 3.65e-07, 3.83e-07]    []

Best model at step 2000:
  train loss: 8.69e-08
  test loss: 8.69e-08
  test metric: []

'train' took 83.282071 s
```

DeepXDE always works on well-posed 1D problems. Are you sure the problem is well-posed, i.e., that the solution is unique?

praksharma commented on Mar 30 '22
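For context, the non-uniqueness can be checked directly. The PDE and both boundary conditions are linear and homogeneous:

0.5*u'' + (x*u)' = 0,   u(-2.2) = u(2.2) = 0

so u ≡ 0 is an exact solution, and c*exp(-x²) satisfies the equation for any constant c (and the boundary conditions up to exp(-4.84) ≈ 0.008). The boundary conditions alone cannot fix the amplitude; only the normalization ∫u dx = 1 singles out the intended solution exp(-x²)/√π.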

@praksharma Thank you for your reply. The right code should be `A=x*y` and `0.5*dy_xx + dA_x`.

The predicted solution of the neural network is very small and fluctuates around 0. The code is missing a normalization condition (∫y dx = 1). I put “0.05*y-1” into the PDE. The shape of the predicted value curve is the same as the true value curve, but the predicted value is many times larger than the real value.

Tomandjob commented on Mar 30 '22

@Tomandjob I don’t think the implementation of the normalisation condition is correct.

praksharma commented on Mar 30 '22

@praksharma Yes, this is an approximate method. I don't know how to define this condition precisely yet.

Tomandjob commented on Mar 30 '22

@Tomandjob For the normalization, see https://github.com/lululxvi/deepxde/issues/174

lululxvi commented on Mar 30 '22
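For reference, one concrete way to impose ∫u dx = 1 without a pointwise penalty (a residual like 0.05*u − 1, minimized on its own, only drives u(x) toward the constant 20) is to let the network learn the antiderivative V(x) = ∫ from −2.2 to x of u(s) ds instead of u itself: the normalization then becomes two ordinary Dirichlet conditions, V(−2.2) = 0 and V(2.2) = 1. Below is a sketch of that formulation with illustrative names; it is not necessarily the approach discussed in #174:

```python
# Sketch: learn the antiderivative V(x) = integral of u from -2.2 to x,
# so that u = V' and the normalization becomes V(2.2) - V(-2.2) = 1.
import deepxde as dde
import numpy as np


def pde(x, v):
    u = dde.grad.jacobian(v, x)         # u = V' is the density
    u_x = dde.grad.hessian(v, x)        # u' = V''
    u_xx = dde.grad.jacobian(u_x, x)    # u'' = V'''
    dA_x = dde.grad.jacobian(x * u, x)  # (x*u)'
    return 0.5 * u_xx + dA_x            # the original PDE, written for V


def boundary_l(x, on_boundary):
    return on_boundary and np.isclose(x[0], -2.2)


def boundary_r(x, on_boundary):
    return on_boundary and np.isclose(x[0], 2.2)


geom = dde.geometry.Interval(-2.2, 2.2)
# Normalization as Dirichlet conditions on V
bc_v_l = dde.icbc.DirichletBC(geom, lambda x: 0, boundary_l)
bc_v_r = dde.icbc.DirichletBC(geom, lambda x: 1, boundary_r)
# The original u = 0 boundary conditions become conditions on V' (the
# target is 0, so the sign of the outward normal does not matter)
bc_u_l = dde.icbc.NeumannBC(geom, lambda x: 0, boundary_l)
bc_u_r = dde.icbc.NeumannBC(geom, lambda x: 0, boundary_r)

data = dde.data.PDE(geom, pde, [bc_v_l, bc_v_r, bc_u_l, bc_u_r], 8000, 2)
```

After training, the density is recovered by differentiating the network output, e.g. `model.predict(X, operator=lambda x, v: dde.grad.jacobian(v, x))`.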