Alexander Kryuchkov

Results 12 comments of Alexander Kryuchkov

This is probably related to #89.

The latest PyPI package still has this issue.

I have the same issue with an RTX 2080, NVIDIA CUDA 11.0.3, and cuDNN 8.

I have the same issue because diffusers raises an explicit exception here: https://github.com/huggingface/diffusers/blob/79c0e24a1442741c59c9b1d2764ba4dbfe56ac71/src/diffusers/models/attention_processor.py#L162

I found the following workaround with torch 1.x: load the model with an 8-bit text encoder and disable xformers at every stage except the upscale stage. For 8-bit loading you...
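A minimal sketch of that workaround. The helper names and the two-stage split are illustrative, not from the original comment; the real ingredient is `load_in_8bit=True` (the transformers/bitsandbytes int8 option) applied only to the text encoder, with xformers left off:

```python
def text_encoder_kwargs(load_in_8bit: bool = True) -> dict:
    """Keyword arguments for CLIPTextModel.from_pretrained (hypothetical helper)."""
    kwargs = {"subfolder": "text_encoder"}
    if load_in_8bit:
        # bitsandbytes int8 loading; needs `pip install bitsandbytes accelerate`
        kwargs.update({"load_in_8bit": True, "device_map": "auto"})
    return kwargs


def load_pipeline(model_id: str):
    """Load a pipeline with an 8-bit text encoder and xformers disabled."""
    import torch
    from transformers import CLIPTextModel
    from diffusers import StableDiffusionPipeline

    text_encoder = CLIPTextModel.from_pretrained(model_id, **text_encoder_kwargs())
    pipe = StableDiffusionPipeline.from_pretrained(
        model_id, text_encoder=text_encoder, torch_dtype=torch.float16
    )
    # Deliberately NOT calling pipe.enable_xformers_memory_efficient_attention();
    # per the comment above, enable xformers only for the upscale stage.
    return pipe
```

This trades a little text-encoder precision for enough VRAM headroom to run the base stages without memory-efficient attention.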

Try replacing the dependencies with the list above; that solved the problem for me.

The issue is probably in the xformers library: https://github.com/facebookresearch/xformers

It is working with the following list of dependencies and no `torch.autocast`:

```yaml
channels:
  - defaults
dependencies:
  - python=3.9
  - pip
  - pytorch::cudatoolkit=11.3
  - pytorch::pytorch==1.12.1
  - pytorch::torchvision==0.13.1
  - numpy
  - ninja...
```

The same issue.