Dump out external trainable variable for external optimizer
Closes #805, #395.
This merge request includes the following changes:
- When an external optimizer (e.g., L-BFGS) is used, the external trainable variables were not output or written after each epoch or at the end of training; the end-of-training part had simply never been implemented. Now the external trainable variables can be output after each epoch as well as at the very end (see the usage sketch after this list).
- One example is tested to verify that the algorithm works as expected.
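For context, here is a minimal usage sketch of the feature this PR completes. It assumes DeepXDE's standard inverse-problem API (`dde.Variable`, `external_trainable_variables`, and the `VariableValue` callback); the toy ODE, the coefficient `C`, and the file name `variables.dat` are illustrative, not taken from this PR.

```python
import numpy as np
import deepxde as dde

# Unknown coefficient to identify: an "external trainable variable",
# i.e., trained alongside, but not part of, the network weights.
C = dde.Variable(2.0)

def ode(x, y):
    # Residual of dy/dx = C * y (exact solution y = exp(C * x)).
    return dde.grad.jacobian(y, x) - C * y

geom = dde.geometry.TimeDomain(0, 1)

# Observations generated from y = exp(x), so the true value of C is 1.
observe_x = np.linspace(0, 1, 10)[:, None]
observe_y = dde.icbc.PointSetBC(observe_x, np.exp(observe_x))

data = dde.data.PDE(geom, ode, [observe_y], num_domain=20, anchors=observe_x)
net = dde.nn.FNN([1] + [20] * 2 + [1], "tanh", "Glorot uniform")
model = dde.Model(data, net)

# Dump C every `period` epochs; with this PR, the value is also written
# at the very end of training with the external (L-BFGS) optimizer.
variable = dde.callbacks.VariableValue(C, period=100, filename="variables.dat")

model.compile("L-BFGS", external_trainable_variables=C)
model.train(callbacks=[variable])
```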
Format your code via black.
@lululxvi I could not figure out why the tests are failing. Is it because of formatting?
No. It seems to be an issue with the TensorFlow installation on the Travis system.
@lululxvi I could not comment on the file directly, but I have a comment about line 562 in model.py:
...
self.train_state.set_data_train(*self.data.train_next_batch(self.batch_size))
self.train_state.set_data_test(*self.data.test())
self._test()
self.callbacks.on_train_begin() # Line 562
...
Line 562 also repeats the last epoch of the previous optimizer if more than one optimizer is used. I think this case was not considered when it was implemented, and it should be skipped if at least two optimizers are used.
Actually, line 562 is not needed at all, because it only reports the initial guess before optimization starts. Do we really need it?
Good point. I suggest keeping it, as the initial values are useful, and we may want to tune the initial values in some cases. Resolving the repeated output between two optimizers is future work.
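For concreteness, a minimal sketch of what that future work might look like around line 562 (the `epoch == 0` guard is an assumption for illustration, not the current implementation; `train_state`, `_test`, and `callbacks` are as in the snippet above):

```python
# Skip the initial report when training resumes with a second optimizer,
# so the last epoch of the previous optimizer is not reported twice.
if self.train_state.epoch == 0:
    self._test()
    self.callbacks.on_train_begin()  # Line 562
```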
I have another remark regarding the output of the loss values when an external optimizer is used: the fetches contain only the training loss, so the test loss is not included. I think we should fetch the test losses as well, right? It was like this before; I just realized it while plotting the losses.
Yes, you are right.
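Conceptually, the fix amounts to re-evaluating on the test data whenever the external optimizer reports progress, instead of only recording the fetched training losses. A rough sketch (the method name `_loss_callback` and the `display_every` bookkeeping are assumptions for illustration; `train_state` and `_test` are as in the snippet above):

```python
# Hypothetical callback invoked by the external optimizer after each step.
def _loss_callback(self, *fetched_train_losses):
    self.train_state.epoch += 1
    self.train_state.step += 1
    if self.train_state.step % self.display_every == 0:
        # _test() evaluates the losses on both the training and the test
        # data and records them, so plots of the loss history include the
        # test losses, not only the training losses the optimizer fetches.
        self._test()
```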
Done.