Person-reID-triplet-loss
Why skip last batch?
Hi Zhedong, thanks a lot for your work. I was a little confused about the following code:

```python
if now_batch_size < opt.batchsize:  # next epoch
    continue
```

Will it influence the performance?
Hi @Tiamo666 Generally, it does not compromise the final performance, but in some cases it does:
- For example, after you have trained on most images in the training set, you may be left with a single image. With batch normalisation, a batch of one image is normalised to zero and produces an abnormal prediction (see the sketch below).
- A small batch size also skews the running mean and std tracked by the batch-norm layers.
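Here is a minimal, self-contained sketch of the first point. It reproduces the batch-norm training-time computation by hand (PyTorch's `nn.BatchNorm1d` actually refuses a batch of one in training mode, which is itself a hint that the statistics degenerate):

```python
import torch

# Feature of a single image, i.e. the leftover last batch.
x = torch.randn(1, 512)

# What batch norm computes in training mode (affine weights at their
# defaults, gamma = 1 and beta = 0):
mean = x.mean(dim=0)                     # equals x itself for a batch of one
var = x.var(dim=0, unbiased=False)       # zero variance
y = (x - mean) / torch.sqrt(var + 1e-5)

print(y.abs().max())                     # tensor(0.) -- every feature is zeroed
```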
@layumi thanks for your kind explanation, I got it. Another question: in your implementation of the triplet loss, the way you pick the hard negative samples is a little different. Your input includes sample, target, pos, pos_target, and you take the hard negatives from "pos", excluding the sample itself. I was confused about that.
@Tiamo666
Sure. You can sample another batch as the negative pool, but it may be more efficient to directly use the other positive data as negative samples.
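A hedged sketch of that idea: for each anchor, mine the hardest negative from the pooled positive data, masking out same-identity entries. The function name and arguments below are illustrative, not the repository's exact signature:

```python
import torch

def hardest_negatives(features, labels, pos_features, pos_labels):
    """For each anchor, return the closest pool sample with a different label."""
    dist = torch.cdist(features, pos_features)                # (N, M) pairwise L2 distances
    same_id = labels.unsqueeze(1) == pos_labels.unsqueeze(0)  # (N, M) same-identity mask
    dist[same_id] = float('inf')                              # never pick a positive
    hard_idx = dist.argmin(dim=1)                             # nearest different-identity sample
    return pos_features[hard_idx]

# Toy usage: four anchors, a pool of twelve positives over four identities.
anchors = torch.randn(4, 8)
anchor_ids = torch.tensor([0, 1, 2, 3])
pool = torch.randn(12, 8)
pool_ids = torch.tensor([0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3])
negs = hardest_negatives(anchors, anchor_ids, pool, pool_ids)  # (4, 8)
```

Reusing one identity's positives as another anchor's negatives avoids a second forward pass over a separately sampled negative batch, which is the efficiency gain mentioned above.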
@layumi OK. And is it necessary to do the random permutation of "nf_data" (in train_new.py, line ~217)? Since your default opt.poolsize is 128 (equal to the batch size of nf_data), the result after sorting the scores will be the same.
@Tiamo666 It is designed for a small pool size. If you use the full pool size, it is not necessary.
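A hedged sketch of why, assuming nf_data holds the candidate negative features and a similarity score ranks them by hardness (variable names follow the thread, not necessarily the exact code at line ~217):

```python
import torch

nf_data = torch.randn(128, 512)          # candidate negative features
query = torch.randn(1, 512)              # anchor feature
poolsize = 128                           # default opt.poolsize

perm = torch.randperm(nf_data.size(0))   # random permutation of the pool
pool = nf_data[perm][:poolsize]          # keep only poolsize candidates

score = torch.mm(pool, query.t()).squeeze(1)   # similarity = hardness score
_, rank = score.sort(descending=True)
hardest = pool[rank[0]]

# With poolsize == len(nf_data), every candidate survives the cut, so the
# sort erases the shuffle and the result is identical without it. With a
# smaller poolsize, the shuffle randomises WHICH candidates are kept
# before ranking, so it does change the outcome.
```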
Hi @layumi, thanks a lot.