[FEAT] Remove generator vars from the gradient of the supervised loss in TimeGAN (as the loss does not depend on them)
(This is not a problem in the algorithm per se, but rather a matter of code clarity, so I do not tag it as a bug.)
In the definition of the `train_supervisor` method:
https://github.com/ydataai/ydata-synthetic/blob/1279e5e1001efe9c4ee53f27ef9561141f9d6bc9/src/ydata_synthetic/synthesizers/timeseries/timegan/model.py#L159-L169
the generator variables are added to `var_list`. This is not needed, as `generator_loss_supervised` does not depend on these variables, so their gradient will be zero and no update to these parameters will take place. This is probably inherited from the original TimeGAN implementation.
If I am correct, I suggest removing these variables from `var_list` for better code clarity.
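
For illustration, here is a minimal sketch of the suggested change. This is a paraphrase, not an exact quote of the linked code; names such as `self.embedder`, `self.supervisor`, `self.generator`, and `self._mse` are assumptions based on the issue text and the usual TimeGAN structure:

```python
import tensorflow as tf

# Sketch of the train_supervisor step with the suggested change applied.
# Assumes a class holding embedder/supervisor/generator sub-models and an
# MSE loss instance in self._mse (hypothetical names for illustration).
@tf.function
def train_supervisor(self, x, opt):
    with tf.GradientTape() as tape:
        h = self.embedder(x)
        h_hat_supervised = self.supervisor(h)
        # Supervised loss: predict the next latent step from the current one.
        generator_loss_supervised = self._mse(h[:, 1:, :], h_hat_supervised[:, :-1, :])

    # Before: supervisor AND generator variables were passed to the optimizer,
    # even though the loss above never touches the generator:
    # var_list = self.supervisor.trainable_variables + self.generator.trainable_variables

    # After: only the supervisor's variables, since the gradient with respect
    # to the generator variables is zero anyway.
    var_list = self.supervisor.trainable_variables
    gradients = tape.gradient(generator_loss_supervised, var_list)
    opt.apply_gradients(zip(gradients, var_list))
    return generator_loss_supervised
```

The training outcome should be unchanged, since a zero gradient produces no parameter update; the change only makes explicit which network this step actually trains.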