Prithivida
Leverage the `inference_on` parameter. I updated it to make multi-GPU usage more intuitive: `-1` is reserved for CPU, and `0` through `998` are reserved for GPUs. The...
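For clarity, here is a minimal sketch (not the library's internal implementation) of how that convention could map onto a torch device, assuming `-1` means CPU and `0`–`998` means the corresponding CUDA index:

```python
# Illustrative only: maps the documented `inference_on` convention
# (-1 = CPU, 0-998 = GPU index) onto a torch.device.
import torch

def resolve_device(inference_on: int) -> torch.device:
    if inference_on == -1:
        return torch.device("cpu")
    if 0 <= inference_on <= 998:
        return torch.device(f"cuda:{inference_on}")
    raise ValueError("inference_on must be -1 (CPU) or 0-998 (GPU index)")

print(resolve_device(-1))  # cpu
print(resolve_device(2))   # cuda:2
```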
> @pratikchhapolika It sounds like you'll need to fire up a separate process for each GPU and pass in `inference_on=0`, `inference_on=1`, `inference_on=2`, and `inference_on=3`, respectively, using `multiprocessing`.
>
> @PrithivirajDamodaran...
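A hedged sketch of that suggestion, using Python's `multiprocessing` to run one worker per GPU. `load_model` and the `transfer(..., inference_on=...)` call are hypothetical placeholders for whatever model API you are actually using:

```python
import multiprocessing as mp

def load_model():
    # Hypothetical placeholder: swap in the real model constructor here.
    class Dummy:
        def transfer(self, sentence, inference_on=-1):
            return f"[device {inference_on}] {sentence}"
    return Dummy()

def worker(gpu_id, sentences):
    # Each process loads its own model and passes its own GPU index.
    model = load_model()
    for s in sentences:
        print(model.transfer(s, inference_on=gpu_id))

if __name__ == "__main__":
    data = ["sentence one", "sentence two", "sentence three", "sentence four"]
    # Split the data across 4 GPUs: one process per GPU index 0-3.
    batches = {gpu: data[gpu::4] for gpu in range(4)}
    procs = [mp.Process(target=worker, args=(gpu, batch))
             for gpu, batch in batches.items()]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```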
I haven't gotten around to doing that yet. You can watch this repo; when I do add that feature, you will get notified :-)
`Now I'm getting a different issue: file/image not found. Under '/content/dalle-pytorch-pretrained/' there is no folder by the name DALLE-pytorch`
Please share a screenshot of the issue.
I have neither a Windows machine nor a dedicated Ubuntu machine to test this on.
For the Ubuntu user, this might be relevant: https://stackoverflow.com/questions/32595050/sudo-pip-install-python-levenshtein-failed-with-error-code-1?answertab=trending#tab-top. Please google it; I am sure others have faced issues with python-Levenshtein.
I added a new demo notebook and everything works fine (check for the link in the README).