semanticGAN_code
Official repo for SemanticGAN https://nv-tlabs.github.io/semanticGAN/
Dear author: Could you share the inference code for Chest X-ray Lung Segmentation? I'd like to study it.
Thanks for sharing your code. I have a question concerning the composition of the DermQuest dataset used in the paper. The official DermQuest labeled dataset repository (https://uwaterloo.ca/vision-image-processing-lab/research-demos/skin-cancer-detection) contains 137 labeled...
Could you please clarify how to organize the dataset together with the metadata files (e.g., "train_full_list.txt", "val_full_list.txt", "unlabel_list.txt", ...)?
dataset
Could you please upload the files "unlabel_list.txt", "train_full_list.txt", and "val_full_list.txt"? I want to know their details. Thanks.
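The two questions above both concern how the `*_list.txt` metadata files are built. Since the repo does not ship them, here is a minimal sketch of one plausible way to generate them: it assumes each list file holds one image file name per line, that labels are PNG masks sharing the image's stem, and that unlabeled images are simply those without a matching mask. The function name, the split fraction, and the file-name convention are all assumptions, not the authors' actual format.

```python
import random
from pathlib import Path

def write_split_lists(image_dir, label_dir, out_dir, val_fraction=0.2, seed=0):
    """Write train_full_list.txt / val_full_list.txt for labeled images and
    unlabel_list.txt for images without a matching label mask.

    Assumes one file name per line; the real repo may expect relative paths
    or a different split scheme, so adjust to taste.
    """
    image_dir, label_dir, out_dir = Path(image_dir), Path(label_dir), Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)

    # An image counts as labeled if a mask with the same stem exists.
    label_stems = {p.stem for p in label_dir.glob("*.png")}
    labeled = sorted(p.name for p in image_dir.glob("*.png") if p.stem in label_stems)
    unlabeled = sorted(p.name for p in image_dir.glob("*.png") if p.stem not in label_stems)

    # Deterministic shuffle so the split is reproducible.
    rng = random.Random(seed)
    rng.shuffle(labeled)
    n_val = int(len(labeled) * val_fraction)

    splits = {
        "train_full_list.txt": labeled[n_val:],
        "val_full_list.txt": labeled[:n_val],
        "unlabel_list.txt": unlabeled,
    }
    for name, names in splits.items():
        (out_dir / name).write_text("\n".join(names) + "\n")
    return splits
```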
Does anyone ever try to train this work via multi-gpu? I've tried single node with multi-gpu and multi-node via slurm. But I always got the issue of `RuntimeError: One of...
Can anyone please confirm whether the reported results are on a different test set, or whether it is the same as the validation set?
Dear researchers, please also consider our newly introduced dynamic-pix2pix architecture, which increases the modeling ability of pix2pix, especially in extremely limited data scenarios. For more information: https://www.researchgate.net/publication/365448869_Dynamic-Pix2Pix_Noise_Injected_cGAN_for_Modeling_Input_and_Target_Domain_Joint_Distributions_with_Limited_Training_Data Thanks
Hi, your project semanticGAN_code(commit id: 342889ebbe817695c0e64133100ede8f9877f3de) requires "albumentations==0.5.2" in its dependency. After analyzing the source code, we found that the following versions of albumentations can also be suitable, i.e., albumentations...
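A loose version pin like `albumentations==0.5.2` can often be widened to a half-open range once compatibility is verified. A minimal, dependency-free sketch for checking an installed version against such a range (the concrete range shown is a hypothetical placeholder, not the actual compatible set found by the analysis above, which is truncated here):

```python
def parse_version(v: str) -> tuple:
    """Parse a plain 'X.Y.Z' version string into a comparable int tuple.
    Naive on purpose: pre-releases like '0.5.2rc1' would need `packaging`."""
    return tuple(int(part) for part in v.split("."))

def in_range(installed: str, low: str, high: str) -> bool:
    """True if low <= installed < high, mirroring a pip '>=low,<high' pin."""
    return parse_version(low) <= parse_version(installed) < parse_version(high)

# Hypothetical range -- substitute whatever range the compatibility
# analysis actually established for albumentations.
print(in_range("0.5.2", "0.5.0", "1.0.0"))
```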
Hello, I am working with datasets that contain more labeled images than unlabeled. In my experiments with default settings, it seems that the generator struggles to learn the input, and...
Hi there, thank you so much for your code and paper. Very impressive work. I just want to reproduce your result, and so far so good. But when I want...