CCNet
Can I not use torch.distributed?
After reading train.py and evaluate.py, I can't find an option to turn torch.distributed off. (I'm trying to run your code on a single GPU.)
Actually, it's a limitation of the Inplace-ABN package: it needs to run inside a torch.distributed context.
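One workaround, if you still want to keep Inplace-ABN on a single GPU, is to initialize a one-process distributed group so the synchronized ops have a valid context. A minimal sketch (the function name and port are my own choices, not from the repo):

```python
# Hypothetical sketch: start a single-process torch.distributed group
# (rank 0 of world size 1) so Inplace-ABN's sync ops can run on one GPU.
import os
import torch.distributed as dist

def init_single_process_group():
    # Environment-variable rendezvous with exactly one process.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    # "gloo" also works on CPU-only machines; use "nccl" for a real GPU run.
    dist.init_process_group(backend="gloo", rank=0, world_size=1)
    return dist.get_world_size()
```

With a world size of 1 the "synchronization" is a no-op, so the numerics match plain BN.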
I also only have one GPU. With a single GPU, can I just use nn.BatchNorm2d directly?
Hi, on a single GPU can I use nn.BatchNorm2d directly, and drop both the DataParallelModel(deeplab) and DataParallelCriterion(criterion) lines?
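For what it's worth, a common single-GPU pattern is to make the norm layer injectable so nn.BatchNorm2d can stand in for the synchronized Inplace-ABN layer, with no DataParallel wrappers at all. A sketch (the helper name `make_head` is mine, not from CCNet):

```python
# Sketch: on one GPU, plain nn.BatchNorm2d replaces the synchronized norm
# layer, and the model is used directly without DataParallelModel.
import torch
import torch.nn as nn

def make_head(in_ch, out_ch, norm_layer=nn.BatchNorm2d):
    # norm_layer is injected so the same code runs with either BN flavor.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        norm_layer(out_ch),
        nn.ReLU(inplace=True),
    )

head = make_head(16, 8)        # single GPU: no DataParallelModel wrapper
x = torch.randn(2, 16, 32, 32)
y = head(x)                    # spatial size preserved by padding=1
```

The criterion can likewise be called directly on the outputs instead of going through DataParallelCriterion.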
Also, the model in your code outputs two values, x and dsn. dsn is meant for an auxiliary loss, as in PSPNet, but the loss computation doesn't take dsn into account.
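A PSPNet-style auxiliary loss would combine the two outputs roughly like this; the 0.4 weight is the one PSPNet reports, and the function is my own sketch, not the repo's criterion:

```python
# Sketch of a PSPNet-style auxiliary loss: the dsn branch adds a weighted
# cross-entropy term (0.4 follows PSPNet; CCNet's choice may differ).
import torch
import torch.nn as nn
import torch.nn.functional as F

def seg_loss(x, dsn, target, aux_weight=0.4):
    ce = nn.CrossEntropyLoss(ignore_index=255)
    # Upsample both predictions to the label resolution before the loss.
    h, w = target.shape[-2:]
    x = F.interpolate(x, size=(h, w), mode="bilinear", align_corners=True)
    dsn = F.interpolate(dsn, size=(h, w), mode="bilinear", align_corners=True)
    return ce(x, target) + aux_weight * ce(dsn, target)
```

At evaluation time only x is used; the dsn branch exists purely to improve training.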