awesome-semantic-segmentation-pytorch
why " iteration=iteration+1" in train.py ?
In train.py, the `train()` method of `Trainer` contains the line `iteration = iteration + 1` inside an `enumerate()` loop over the data loader. Isn't it redundant, since `enumerate()` already yields an index?
```python
def train(self):
    save_to_disk = get_rank() == 0
    epochs, max_iters = self.args.epochs, self.args.max_iters
    log_per_iters, val_per_iters = self.args.log_iter, self.args.val_epoch * self.args.iters_per_epoch
    save_per_iters = self.args.save_epoch * self.args.iters_per_epoch

    start_time = time.time()
    logger.info('Start training, Total Epochs: {:d} = Total Iterations {:d}'.format(epochs, max_iters))

    self.model.train()
    for iteration, (images, targets, _) in enumerate(self.train_loader):
        iteration = iteration + 1
        self.lr_scheduler.step()

        images = images.to(self.device)
```
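For context, here is a minimal, self-contained sketch of what I mean (the `dummy_loader` list is just a stand-in for `self.train_loader`, not code from this repository). It shows that the extra `+ 1` only shifts the counter from 0-based to 1-based, which `enumerate()` can already do through its `start` argument:

```python
# Stand-in for self.train_loader, only for illustration.
dummy_loader = [("img0", "tgt0"), ("img1", "tgt1"), ("img2", "tgt2")]

# Pattern A: the style used in train.py — enumerate() starts at 0,
# then the explicit "+ 1" makes the counter 1-based.
for iteration, (images, targets) in enumerate(dummy_loader):
    iteration = iteration + 1
    print("pattern A:", iteration)

# Pattern B: let enumerate() start at 1 directly, no extra statement needed.
for iteration, (images, targets) in enumerate(dummy_loader, start=1):
    print("pattern B:", iteration)
```

Both loops print the same counters (1, 2, 3), so I wonder whether the explicit increment serves some other purpose here or is simply a stylistic choice.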