
Descending accuracy for each batch

Open yaldashbz opened this issue 2 years ago • 3 comments

Hi,

I am using your dataset and your pretrained model, and training the mean-teacher with the default hyper-parameters. Still, the accuracy printed during the validation phase (acc_re, which you display in the progress bar) decreases within each batch, and then for the next batch it suddenly jumps back up to the starting accuracy of the previous batch, so the graph looks periodic. Could you please explain the reason?

Thanks.

yaldashbz avatar Sep 28 '22 16:09 yaldashbz

Hi @yaldashbz, I have not encountered this before; can you provide more details about the phenomenon?

chaneyddtt avatar Sep 29 '22 03:09 chaneyddtt

I was wrong about the jump after each batch; sorry about that. But I still have a question. During training, the accuracy is computed in the validation method after each epoch, and the accuracy graph has a descending form (for example, in the second epoch it starts at 1.0, then 0.86, 0.81, ..., oscillates between 0.69 and 0.78, and ends at 0.70). Note that I'm not using mixup for training. I don't understand the reason.

I've attached the graph for ACC and the loss of a model trained for one epoch.

Screenshot from 2022-09-29 15-12-31

yaldashbz avatar Sep 29 '22 13:09 yaldashbz

We compute the average accuracy over the test data during validation. At the first iteration only one batch has been seen, so the accuracy can be very high depending on the difficulty of the data in that batch. As the iteration count grows, the accuracy is averaged over more batches of data, so the displayed value changes and gradually settles toward the overall mean.

chaneyddtt avatar Sep 30 '22 11:09 chaneyddtt
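The running-average behavior described above can be sketched with a minimal example. This is a hypothetical illustration (the `AverageMeter` class and the batch accuracies below are assumptions, not taken from the repo), mimicking the common pattern where a validation loop accumulates accuracy across batches and prints the running mean in the progress bar:

```python
# Hypothetical AverageMeter-style running accuracy, illustrating why the
# value shown in the progress bar starts at the first batch's accuracy
# and then drifts (often downward) toward the overall mean.
class AverageMeter:
    def __init__(self):
        self.sum = 0.0
        self.count = 0

    def update(self, val, n=1):
        # Accumulate a batch accuracy weighted by the batch size.
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        # Running mean over all batches seen so far.
        return self.sum / self.count

# Simulated per-batch accuracies: an easy first batch, harder later ones.
batch_accs = [1.0, 0.7, 0.7, 0.7]

meter = AverageMeter()
printed = []
for acc in batch_accs:
    meter.update(acc, n=32)  # n = batch size
    printed.append(round(meter.avg, 3))

# The running average starts at 1.0 and decreases as more batches
# are averaged in, producing the descending curve within an epoch.
print(printed)  # [1.0, 0.85, 0.8, 0.775]
```

The final value, not the per-iteration curve, is the meaningful validation accuracy; the descending shape within an epoch is just the running mean converging.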