Nerdy Rodent
Recognised the error, but took a bit of documentation diving to figure out how to get clip-as-a-service to use something other than ViT-B/32. Basically, create a .yml file like this...
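Something along these lines (a sketch from memory of the clip-server docs, so the exact keys and the model-name format may differ between versions):

```yaml
# my-clip.yml -- start the server with: python -m clip_server my-clip.yml
jtype: Flow
version: '1'
with:
  port: 51000
executors:
  - name: clip_t
    uses:
      jtype: CLIPEncoder
      with:
        name: ViT-L/14        # the model to serve instead of the default ViT-B/32
      metas:
        py_modules:
          - clip_server.executors.clip_torch
```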
Yes, Runtime -> Restart Runtime is correct.
Sounds fun. A bit like story mode, but more interactive.
When I tested with my hardware, PyTorch 1.8.1 with CUDA 11.1 was faster without jit than 1.7.1 with jit.
I'm on a 3090 and so can't use CUDA 10 for tests, but my guess is that the improvements in CUDA 11.1 outweigh any benefit from jit + CUDA 11.0.
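For anyone wanting to repeat the comparison, the jit on/off switch is just the `jit` argument to `clip.load` in the OpenAI CLIP package. A rough timing sketch (model name, prompt and loop count are arbitrary; assumes a CUDA GPU):

```python
import time
import torch
import clip  # pip install git+https://github.com/openai/CLIP.git

device = "cuda"  # the comparison above was GPU vs GPU

for use_jit in (True, False):
    # jit=True loads the TorchScript-compiled model, jit=False the plain Python one
    model, preprocess = clip.load("ViT-B/32", device=device, jit=use_jit)
    text = clip.tokenize(["an apple on a table"]).to(device)
    torch.cuda.synchronize()
    start = time.time()
    with torch.no_grad():
        for _ in range(100):
            model.encode_text(text)
    torch.cuda.synchronize()
    print(f"jit={use_jit}: {time.time() - start:.2f}s for 100 text encodes")
```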
Have you installed CUDA? e.g. `conda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch -c conda-forge` Ref: https://pytorch.org/get-started/locally/
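Quick sanity check after installing (plain PyTorch calls, nothing project-specific):

```python
import torch

print(torch.__version__)          # e.g. 1.8.1
print(torch.version.cuda)         # CUDA version the build was compiled against, e.g. 11.1
print(torch.cuda.is_available())  # should print True once the install is right
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```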
This uses 21,880 MiB of VRAM for me:
```
python scripts/txt2img.py --prompt "An alien landscape, vector art" --n_samples 1 --n_iter 1 --H 704 --W 768 --ddim_steps 50 --outdir outputs/txt2img-samples/largest
```
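If you want to check your own numbers, peak usage can be read from nvidia-smi or, inside a script, from PyTorch's memory stats. A generic sketch (not part of txt2img.py):

```python
import torch

torch.cuda.reset_peak_memory_stats()

# ... run the sampling / inference code here ...

# Note: this only counts tensors allocated by PyTorch; nvidia-smi will show a
# somewhat higher total because of the CUDA context and cached memory.
peak_mib = torch.cuda.max_memory_allocated() / (1024 ** 2)
print(f"Peak VRAM allocated: {peak_mib:.0f} MiB")
```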
Nice! I ended up making a 96GB swapfile. Still eats RAM when running inference though :|
I'm using Shivam's DreamBooth and I don't get that blotchy look unless I put the guidance scale above 30 during inference (or above 10 on a sampler like LMS). All the...
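For anyone wanting to reproduce the comparison, guidance scale and sampler are both set at inference time in the diffusers pipeline. A rough sketch (model path and prompt are placeholders; API as in recent diffusers versions):

```python
import torch
from diffusers import StableDiffusionPipeline, LMSDiscreteScheduler

model_path = "path/to/dreambooth/output"  # placeholder for the trained model directory
pipe = StableDiffusionPipeline.from_pretrained(model_path, torch_dtype=torch.float16).to("cuda")

# Optional: swap the default sampler for LMS
pipe.scheduler = LMSDiscreteScheduler.from_config(pipe.scheduler.config)

# Higher guidance scales push towards the over-baked / blotchy look
for gs in (7.5, 10, 30):
    image = pipe("photo of sks person", guidance_scale=gs, num_inference_steps=50).images[0]
    image.save(f"guidance_{gs}.png")
```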
Been testing with my face all day with a variety of optimisers because reasons :) Things learnt so far (optimiser setup sketched below) -
* The defaults seem very good, with 800-1800...
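For context, in the diffusers-style DreamBooth training scripts the optimiser choice usually boils down to something like this (a rough sketch, not Shivam's exact code; the UNet stand-in and learning rate are placeholders):

```python
import torch

try:
    # 8-bit Adam from bitsandbytes (what --use_8bit_adam switches on in the
    # diffusers-style training scripts)
    import bitsandbytes as bnb
    optimizer_cls = bnb.optim.AdamW8bit
except ImportError:
    optimizer_cls = torch.optim.AdamW  # plain AdamW fallback

unet = torch.nn.Linear(4, 4)  # stand-in for the real UNet being fine-tuned

optimizer = optimizer_cls(
    unet.parameters(),
    lr=5e-6,            # typical DreamBooth learning rate
    betas=(0.9, 0.999),
    weight_decay=1e-2,
    eps=1e-8,
)
```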