awesome-semantic-segmentation-pytorch

Semantic Segmentation on PyTorch (includes FCN, PSPNet, Deeplabv3, Deeplabv3+, DANet, DenseASPP, BiSeNet, EncNet, DUNet, ICNet, ENet, OCNet, CCNet, PSANet, CGNet, ESPNet, LEDNet, DFANet)

113 awesome-semantic-segmentation-pytorch issues, sorted by recently updated

I run training with "python -m torch.distributed.launch --nproc_per_node=4", using dataset=cityscape, backbone=resnet50, batchsize=4. My GPUs are 4x Nvidia Titan X with 12 GB of memory per card, and I still get CUDA out of memory...
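One general way around this, independent of this repo's training script, is mixed-precision training, which roughly halves activation memory; lowering the crop size or per-GPU batch size helps as well. The sketch below is a minimal, self-contained torch.cuda.amp example, not the repo's trainer; the model, batch, and class count are placeholders.

```python
# Minimal mixed-precision training sketch (placeholders, not the repo's code).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.Conv2d(16, 19, 1)).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

# Dummy batch standing in for a Cityscapes crop (19 classes).
images = torch.randn(2, 3, 256, 256, device=device)
targets = torch.randint(0, 19, (2, 256, 256), device=device)

optimizer.zero_grad()
with torch.cuda.amp.autocast(enabled=(device == "cuda")):
    logits = model(images)            # forward pass runs in float16 where safe
    loss = criterion(logits, targets)
scaler.scale(loss).backward()         # scaled backward pass to avoid underflow
scaler.step(optimizer)
scaler.update()
```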

Hi, I'm a beginner. Training PSP on VOC with the default settings runs fine, but I wanted to make Ci (the input channel count) smaller, so I changed it to: class _PSPHead(nn.Module): def __init__(self, nclass, norm_layer=nn.BatchNorm2d, norm_kwargs=None, **kwargs): super(_PSPHead, self).__init__() self.psp = _PyramidPooling(512, norm_layer=norm_layer, norm_kwargs=norm_kwargs) self.block = nn.Sequential( nn.Conv2d(1024, 128, 3, padding=1, bias=False),  # Ci, Co, kernel_size norm_layer(128, **({} if norm_kwargs is None...
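If that change fails with a channel/shape mismatch, the likely cause (assuming the default resnet50 backbone, whose stage-4 feature map has 2048 channels) is that _PyramidPooling(512) no longer matches what the backbone feeds it. A hedged sketch of one way to shrink the head is to reduce channels with a 1x1 conv first. The stub below only mimics the channel doubling implied by the snippet (512 in, 1024 out); it is not the repo's _PyramidPooling.

```python
# Hedged sketch: reduce 2048 -> 512 before a smaller PSP head (not the repo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class _PyramidPoolingStub(nn.Module):
    """Toy pyramid pooling: concatenates x with 4 pooled-and-projected branches,
    each carrying in_channels // 4 channels, so out_channels = 2 * in_channels."""
    def __init__(self, in_channels):
        super().__init__()
        out = in_channels // 4
        self.convs = nn.ModuleList([nn.Conv2d(in_channels, out, 1) for _ in range(4)])
        self.sizes = (1, 2, 3, 6)

    def forward(self, x):
        h, w = x.shape[2:]
        feats = [x]
        for size, conv in zip(self.sizes, self.convs):
            y = F.adaptive_avg_pool2d(x, size)
            feats.append(F.interpolate(conv(y), (h, w), mode="bilinear", align_corners=True))
        return torch.cat(feats, dim=1)

class SmallPSPHead(nn.Module):
    def __init__(self, nclass):
        super().__init__()
        self.reduce = nn.Conv2d(2048, 512, 1, bias=False)  # match resnet50's stage-4 output
        self.psp = _PyramidPoolingStub(512)                 # 512 -> 1024 channels
        self.block = nn.Sequential(
            nn.Conv2d(1024, 128, 3, padding=1, bias=False),
            nn.BatchNorm2d(128),
            nn.ReLU(True),
            nn.Conv2d(128, nclass, 1),
        )

    def forward(self, x):
        return self.block(self.psp(self.reduce(x)))

# Sanity check with a fake stage-4 feature map (batch 1, 2048 channels).
out = SmallPSPHead(nclass=21)(torch.randn(1, 2048, 32, 32))
print(out.shape)  # torch.Size([1, 21, 32, 32])
```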

amax@amax:/data/yh/awesome-semantic-segmentation-pytorch/scripts$ ./dfanet_resnet18_pascal_voc.sh /home/amax/anaconda3/lib/python3.6/site-packages/requests/__init__.py:80: RequestsDependencyWarning: urllib3 (1.25.3) or chardet (3.0.4) doesn't match a supported version! RequestsDependencyWarning) 2019-08-21 14:15:23,071 semantic_segmentation INFO: Using 1 GPUs 2019-08-21 14:15:23,071 semantic_segmentation INFO: Namespace(aux=False, aux_weight=0.4, backbone='resnet18', base_size=520,...

Please, how do I solve this problem? RuntimeError: CUDA out of memory. Tried to allocate 74.00 MiB (GPU 0; 1.96 GiB total capacity; 1.46 GiB already allocated; 71.50 MiB free; 38.53 MiB...

2019-11-20 11:16:10,422 semantic_segmentation INFO: Iters: 880/139700 || Lr: 0.000099 || Loss: 0.0033 || Cost Time: 0:06:46 || Estimated Time: 17:46:33 2019-11-20 11:16:15,154 semantic_segmentation INFO: Iters: 890/139700 || Lr: 0.000099 ||...

> There might be something wrong with the version of `pillow`. Try this: `border=(0, 0, padw, padh)` -> `border=20`

Thank you very much, I solved it. But I don't...
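For anyone who wants to keep the asymmetric padding (the tuple pads only the right and bottom), a version-independent workaround, assuming the call in question is PIL's `ImageOps.expand`, is to paste the image onto a fresh canvas instead. This is a sketch, not the repo's code; `padw`/`padh` stand for whatever padding the crop logic computed.

```python
# Hedged alternative to ImageOps.expand for right/bottom-only padding.
from PIL import Image

def pad_right_bottom(img: Image.Image, padw: int, padh: int, fill=0) -> Image.Image:
    """Paste img onto a larger canvas so only the right and bottom are padded."""
    canvas = Image.new(img.mode, (img.width + padw, img.height + padh), fill)
    canvas.paste(img, (0, 0))
    return canvas

# Example: pad a dummy 480x360 image by 40 px on the right and 160 px on the bottom.
img = Image.new("RGB", (480, 360))
padded = pad_right_bottom(img, padw=40, padh=160)
print(padded.size)  # (520, 520)
```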

I don't understand this line: `attention_new = torch.max(attention, dim=-1, keepdim=True)[0].expand_as(attention) - attention`
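This pattern appears in channel-attention modules such as DANet's, where a softmax over the last dimension follows. Two things are true of it regardless of the surrounding code: softmax is invariant to adding a per-row constant, so softmax(row_max - x) equals softmax(-x) (larger raw energies end up with smaller weights), and subtracting the row max keeps every exponent non-positive, which avoids overflow. A small sketch verifying the equivalence:

```python
# Verify: softmax(row_max - x) == softmax(-x), since softmax ignores per-row constants.
import torch

attention = torch.randn(2, 8, 8)  # stand-in for a (B, C, C) channel-energy map

attention_new = torch.max(attention, dim=-1, keepdim=True)[0].expand_as(attention) - attention
weights_a = torch.softmax(attention_new, dim=-1)
weights_b = torch.softmax(-attention, dim=-1)

print(torch.allclose(weights_a, weights_b, atol=1e-6))  # True
```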

wenwu@Amax:~/chenyao/awesome-semantic-segmentation-pytorch-master/scripts$ python train.py --model fcn32s --backbone vgg16 --dataset pascal_voc --lr 0.0001 --epochs 50 2021-09-08 18:06:48,524 semantic_segmentation INFO: Using 1 GPUs 2021-09-08 18:06:48,524 semantic_segmentation INFO: Namespace(aux=False, aux_weight=0.4, backbone='vgg16', base_size=520, batch_size=4,...

ggw@vip-workstation1:~/guoanXu/awesome/core/data/downloader$ python ade20k.py --download-dir ../datasets/citys Traceback (most recent call last): File "ade20k.py", line 51, in download_ade(_TARGET_DIR, overwrite=False) File "ade20k.py", line 35, in download_ade makedirs(download_dir) File "/home/ggw/guoanXu/awesome/core/utils/filesystem.py", line 16, in makedirs...