Aaron2117
> The author must have trained more than 30,000 steps. The following is the result of my 30,000 steps: ...
@elenakovacic 300 epochs. I ran inference on the train set and the results are OK, but on the test set they are bad.
@neuralchen I trained for 300 epochs on the VITON-HD dataset. The image size is 512 x 384, and the loss is about 0.03. Can you tell me what your final loss...