ESRGAN
CPU Device Segfaults
Using 'cpu' instead of 'cuda' causes a segfault. CUDA 10.0.130, PyTorch 1.0.0 (latest versions).
Not sure where any other information regarding the crash would be located.
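For reference, the switch is a one-line change. A minimal sketch of roughly what the modified test.py looks like; the module and constructor names here are assumptions, since they differ between ESRGAN revisions:

```python
# Sketch of the 'cpu' vs 'cuda' switch, assuming the stock test.py layout.
# RRDBNet_arch / RRDBNet are assumptions; older revisions name them differently.
import torch
import RRDBNet_arch as arch

device = torch.device('cpu')  # was torch.device('cuda'); this is the only change

model = arch.RRDBNet(3, 3, 64, 23, gc=32)
model.load_state_dict(torch.load('models/RRDB_ESRGAN_x4.pth'), strict=True)
model.eval()
model = model.to(device)  # inference then proceeds exactly as in the GPU case
```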
Hi @Beinsezii, I tested with CPU mode (with PyTorch 1.0) and there is no crash. BTW, if you have CUDA, why not try using GPU mode?
A 1000x1000 image uses more than my 8 gigabytes of VRAM. The main area where I use fancy upscaling is converting <=1080p images to better fit newer (4k) screens. It's not really a big deal since I wrote a simple Python command-line utility that uses PIL and ImageMagick to split an image into smaller strips and recombine them, which lets me process basically any image with CUDA. Since it's not reproducible on your end, later I could try playing around with some stuff to see if anything bites.
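For anyone curious, the split/recombine idea is roughly this. A minimal PIL-only sketch, not the actual utility: the real script also shells out to ImageMagick, and naive strips will show seams at the boundaries unless you overlap and blend them. The file names, strip height, and the upscale() hook are illustrative assumptions.

```python
# Split an image into horizontal strips, upscale each strip, and reassemble,
# so no single tensor has to fit the whole image in VRAM.
from PIL import Image

SCALE = 4          # ESRGAN x4
STRIP_H = 256      # strip height that fits in VRAM; tune for your GPU

def upscale(strip: Image.Image) -> Image.Image:
    # Placeholder: run the ESRGAN model (or any upscaler) on one strip here.
    return strip.resize((strip.width * SCALE, strip.height * SCALE), Image.LANCZOS)

def split_upscale_join(path_in: str, path_out: str) -> None:
    img = Image.open(path_in).convert('RGB')
    out = Image.new('RGB', (img.width * SCALE, img.height * SCALE))
    for top in range(0, img.height, STRIP_H):
        strip = img.crop((0, top, img.width, min(top + STRIP_H, img.height)))
        out.paste(upscale(strip), (0, top * SCALE))
    out.save(path_out)

if __name__ == '__main__':
    split_upscale_join('input.png', 'output_x4.png')
```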
Any chance you could share this utility? I'm about to do something similar, but this would save some effort. Thanks!
Uhhhh I guess. I've never shared code before and this script in particular is rather incomplete, so it's kinda archaic with almost no documentation or error handling, but here you go.
File is in my cloud drive -- GitHub doesn't want me sharing a .py file. Mildly ironic.
It's a command-line utility, so you use it like you would any other. argparse generates the help info, so image_split.py -h or something should work. I'm booted into Windows instead of Arch right now, so I can't test it myself.
Needs Python 3.6+, ImageMagick, and the Python Imaging Library (PIL).
That works quite well, thanks!
I encountered a segmentation fault as well, but not on smaller images like the sample baboon. I don't know the precise cut-off point, but 710 x 443 seems to be sufficiently large to trigger it.
It's unlikely to be a RAM issue, because memory usage doesn't exceed ~1.2 GB (and I have 32 GB total).
$ python3 test.py models/RRDB_ESRGAN_x4.pth
Model path models/RRDB_ESRGAN_x4.pth.
Testing...
1 ultima7
/home/frans/.local/lib/python3.7/site-packages/torch/nn/modules/upsampling.py:129: UserWarning: nn.Upsample is deprecated. Use nn.functional.interpolate instead.
warnings.warn("nn.{} is deprecated. Use nn.functional.interpolate instead.".format(self.name))
Segmentation fault
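In case it helps narrow this down: a bare "Segmentation fault" hides where the crash happens, but Python's built-in faulthandler can dump a Python-level traceback when the process receives SIGSEGV. A minimal sketch, purely a diagnostic and not a fix:

```python
# Put this at the very top of test.py (or run: python3 -X faulthandler test.py ...)
# so a Python traceback is printed if the interpreter crashes with SIGSEGV.
import faulthandler
faulthandler.enable()
```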
@Beinsezii
--Github doesn't want me sharing a .py file. Mildly ironic.
It'll work as an archive of various sorts: image_split.py.tar.gz
But yeah, otherwise they want you to use a gist, I guess.
I have this exact same error and have been trying to troubleshoot it for the past few days with no success. Any help would be appreciated!