PyMAF
Question about regressor's output in each iteration
Hi, once more thanks for your fantastic work! I noticed that your method has a different regressor for each iteration, and the output from the previous iteration serves as the input to the following regressor. How do you prevent the regressor's output from diverging (i.e., moving further away from the GT)? As I understand it, you save the output of each iteration, then compute the loss on all of them and backpropagate for each batch. I wonder whether this is what addresses the issue above. A minimal sketch of my understanding is below. Thanks again!
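To make my question concrete, here is a rough sketch of what I mean by supervising every iteration's output. The module and variable names here are made up for illustration and are not taken from the PyMAF code; it just shows one regressor per iteration, with a loss applied to each intermediate estimate:

```python
import torch
import torch.nn as nn

class IterativeRegressor(nn.Module):
    """Toy stand-in for a cascade of per-iteration regressors (hypothetical names)."""
    def __init__(self, feat_dim=64, param_dim=10, num_iters=3):
        super().__init__()
        # A separate regressor for each iteration, as described in the question.
        self.regressors = nn.ModuleList(
            nn.Linear(feat_dim + param_dim, param_dim) for _ in range(num_iters)
        )

    def forward(self, feat, init_params):
        params = init_params
        outputs = []
        for reg in self.regressors:
            # Each regressor refines the previous iteration's estimate.
            params = params + reg(torch.cat([feat, params], dim=1))
            outputs.append(params)
        return outputs  # keep every intermediate estimate for supervision

# Dummy data: image features, an initial parameter estimate, and GT parameters.
model = IterativeRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
feat = torch.randn(8, 64)
init = torch.zeros(8, 10)
gt = torch.randn(8, 10)

outputs = model(feat, init)
# Sum the per-iteration losses so each regressor receives a gradient that
# pulls its own output toward the ground truth, not only the final one.
loss = sum(nn.functional.mse_loss(out, gt) for out in outputs)
opt.zero_grad()
loss.backward()
opt.step()
```

Is this roughly what keeps each iteration's output from drifting away from the GT, or is there an additional mechanism?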