LTA
Seen_N in train
Hi, nice work! I have read the code and have a question. In the batch sampler, I see we actually use the remaining classes as the seen classes. Why not use seen_N?
```python
class MNFBatchSampler(ClassBaseBatchSampler):
    # classes for unseen
    class_idxs = rand_classes[:self.unseen_N]
    for i, c in enumerate(self.classes[class_idxs]):
        sample_idxs = torch.randperm(self.num_per_class[c])[:self.unseen_K]
        unseen_classes.append(int(c))
        unseen_query += self.indexes[c][sample_idxs].int().tolist()

    # classes for remain
    seen_class_idxs = rand_classes[self.seen_N:]  # why? seen_N has no effect at all?
    for i, c in enumerate(self.classes[seen_class_idxs]):
        sample_idxs = torch.randperm(self.num_per_class[c])[:self.seen_K]
        seen_classes.append(int(c))
        seen_query += self.indexes[c][sample_idxs].int().tolist()
```
Maybe you dropped a "-":

```python
# classes for remain
seen_class_idxs = rand_classes[-self.seen_N:]
```
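The difference between the two slices can be checked in isolation. A minimal sketch (the example sizes `num_classes = 10`, `unseen_N = 3`, `seen_N = 4` are hypothetical, chosen only to make the counts visible):

```python
import torch

num_classes, unseen_N, seen_N = 10, 3, 4
rand_classes = torch.randperm(num_classes)

# unseen split: first unseen_N shuffled classes
unseen = rand_classes[:unseen_N]

# current code: skips the first seen_N entries and keeps everything
# after them, i.e. num_classes - seen_N classes instead of seen_N
buggy = rand_classes[seen_N:]

# suggested fix: the last seen_N entries, disjoint from the unseen
# split as long as unseen_N + seen_N <= num_classes
fixed = rand_classes[-seen_N:]

print(len(buggy), len(fixed))  # 6 4
```

Note that with the current slice, the seen split can even overlap the unseen one whenever `unseen_N > seen_N`, since entries at positions `seen_N .. unseen_N - 1` land in both.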
Yeah, you are right. I forgot to test this class when I rebuilt the code. I will fix it soon!
Thank you!