
Seen_N in train

wwx13 opened this issue 2 years ago · 3 comments

Hi, nice work! I have read the code and I have a question. In the batch sampler, I see that the remaining classes are actually used as the seen classes. Why isn't seen_N used?

```python
class MNFBatchSampler(ClassBaseBatchSampler):
    # classes for unseen
    class_idxs = rand_classes[:self.unseen_N]
    for i, c in enumerate(self.classes[class_idxs]):
        sample_idxs = torch.randperm(self.num_per_class[c])[:self.unseen_K]
        unseen_classes.append(int(c))
        unseen_query += self.indexes[c][sample_idxs].int().tolist()

    # classes for remain
    seen_class_idxs = rand_classes[self.seen_N:]    # why? seen_N is not used at all?
    for i, c in enumerate(self.classes[seen_class_idxs]):
        sample_idxs = torch.randperm(self.num_per_class[c])[:self.seen_K]
        seen_classes.append(int(c))
        seen_query += self.indexes[c][sample_idxs].int().tolist()
```
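For context, here is a minimal standalone sketch of what the current slice produces. The sizes are made up for illustration and are not the repo's real settings; I only assume `rand_classes` is a random permutation of all class indices, as in the sampler:

```python
import torch

# Made-up sizes, just to illustrate the slicing (not the repo's real settings).
num_classes, unseen_N, seen_N = 20, 8, 5
rand_classes = torch.randperm(num_classes)

unseen_idxs = rand_classes[:unseen_N]    # 8 unseen classes, as expected
seen_idxs = rand_classes[seen_N:]        # 15 classes instead of 5, and positions 5..7 also appear in unseen_idxs
print(len(unseen_idxs), len(seen_idxs))  # 8 15
```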

wwx13 · Aug 19, 2022

Maybe a "-" is missing:

```python
# classes for remain
seen_class_idxs = rand_classes[-self.seen_N:]
```
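A quick sanity check of the proposed slice, again with made-up sizes and assuming `rand_classes` is a permutation of all class indices:

```python
import torch

# Same made-up sizes as above; with the "-" the slice behaves as expected.
num_classes, unseen_N, seen_N = 20, 8, 5
rand_classes = torch.randperm(num_classes)

unseen_idxs = rand_classes[:unseen_N]
seen_idxs = rand_classes[-seen_N:]       # exactly seen_N classes
assert len(seen_idxs) == seen_N
# disjoint from the unseen classes as long as unseen_N + seen_N <= num_classes
assert not set(seen_idxs.tolist()) & set(unseen_idxs.tolist())
```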

wwx13 · Aug 19, 2022

Yeah, you're right. I forgot to test this class when I rebuilt the code. I will fix it soon!

Quareia · Aug 19, 2022

> Yeah, you're right. I forgot to test this class when I rebuilt the code. I will fix it soon!

Thank you!

wwx13 · Aug 19, 2022