faster-rcnn.pytorch
Confused about sampler and dataloader
Hi, thanks for sharing your nice code. But I am confused about your sampler and the __getitem__ function in roibatchLoader.py.
For the sampler in trainval_net.py:

```python
def __iter__(self):
    rand_num = torch.randperm(self.num_per_batch).view(-1, 1) * self.batch_size
    self.rand_num = rand_num.expand(self.num_per_batch, self.batch_size) + self.range
    self.rand_num_view = self.rand_num.view(-1)
```
I don't see how these three lines differ from simply using torch.randperm(train_size).
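To make the difference concrete, here is a toy sketch (my own small numbers, not the repo's) of what those three lines compute compared with a plain torch.randperm. If I read it right, the sampler permutes whole batches but keeps the indices inside each batch consecutive; that matters because (if I understand the repo correctly) the roidb is sorted by aspect ratio, so a batch then draws images with similar aspect ratios:

```python
import torch

# Toy reconstruction of the sampler's index math (my own numbers).
train_size, batch_size = 6, 2
num_per_batch = train_size // batch_size
rng = torch.arange(batch_size)  # plays the role of self.range

rand_num = torch.randperm(num_per_batch).view(-1, 1) * batch_size
rand_num = rand_num.expand(num_per_batch, batch_size) + rng
batch_shuffled = rand_num.view(-1)

# Each batch is a run of consecutive indices: only the *order of batches*
# is random, never the order of samples within a batch.
for pair in batch_shuffled.view(-1, batch_size):
    assert pair[1] == pair[0] + 1

# A plain per-sample shuffle gives no such guarantee:
sample_shuffled = torch.randperm(train_size)
print(batch_shuffled.tolist(), sample_shuffled.tolist())
```

So with torch.randperm(train_size) a batch could mix a very wide image with a very tall one, forcing heavy padding; the batch-level shuffle avoids that.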
For the __getitem__ function in roibatchLoader.py, at line 165 and line 175:
```python
# line 165
padding_data = torch.FloatTensor(int(np.ceil(data_width / ratio)),
                                 data_width, 3).zero_()
# line 175
padding_data = torch.FloatTensor(data_height,
                                 int(np.ceil(data_height * ratio)), 3).zero_()
```
It seems that nothing changes for padding_data, because data_width / ratio = data_height and data_height * ratio = data_width, which means padding_data has exactly the same size as data.
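A quick numeric check (toy numbers of mine): the two sizes coincide only when ratio is exactly data_width / data_height. My guess, which I can't confirm from the snippet alone, is that ratio here is a batch-level, possibly clamped value, in which case the padded canvas really is larger than the image:

```python
import numpy as np

data_height, data_width = 600, 1800   # toy image, true aspect ratio 3.0

# If ratio is exactly width / height, padding_data matches the image size:
ratio = data_width / data_height
assert int(np.ceil(data_width / ratio)) == data_height

# But if ratio were clamped to some batch-level bound (an assumed value of
# 2.0 here), the canvas gets extra rows and the image is padded vertically:
clamped = 2.0
padded_height = int(np.ceil(data_width / clamped))
print(padded_height - data_height)  # 300 extra rows of zero padding
```

So the padding would only be a no-op in the special case where the batch ratio equals the image's own aspect ratio.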
More than that, I tried printing index in the __getitem__ function, and found that the function is called 3 * batch_size times. Taking batch_size = 2 as an example:
```
index 37198  size torch.Size([1, 600, 732, 3])   padding_data size torch.Size([3, 600, 732])
index 37199  size torch.Size([1, 600, 732, 3])   padding_data size torch.Size([3, 600, 732])
image index tensor([37198, 37199])
index 135064 size torch.Size([1, 600, 1095, 3])  padding_data size torch.Size([3, 600, 1095])
index 135065 size torch.Size([1, 600, 1095, 3])  padding_data size torch.Size([3, 600, 1095])
index 490    size torch.Size([1, 1064, 600, 3])  padding_data size torch.Size([3, 1064, 600])
index 491    size torch.Size([1, 1064, 600, 3])  padding_data size torch.Size([3, 1064, 600])
```
This illustrates the confusion I described above. Does anyone have an idea about this? Thanks.
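One guess about the 3 * batch_size calls (purely an assumption on my part): with num_workers > 0, the DataLoader's worker processes prefetch batches, so __getitem__ runs for several upcoming batches before the first one is even consumed. A minimal single-process illustration of counting the calls:

```python
import torch
from torch.utils.data import Dataset, DataLoader

calls = []

class Toy(Dataset):
    def __len__(self):
        return 8
    def __getitem__(self, i):
        calls.append(i)          # record every __getitem__ call
        return torch.tensor(i)

# With num_workers=0, exactly batch_size calls happen per batch consumed.
loader = DataLoader(Toy(), batch_size=2, num_workers=0)
batch = next(iter(loader))
print(calls)  # [0, 1] -- with worker processes, more indices would already
              # have been fetched by now because batches are prefetched
```

If that guess is right, printing index inside __getitem__ with several workers would naturally show num_workers * batch_size calls around the first batch, which could explain the 3 * batch_size pattern.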
@jwyang Hi, I have the same question. Does anyone have any idea about this? Thanks.