CyberVy

3 comments by CyberVy

I got the same issue.

```python
from diffusers import FluxTransformer2DModel
import torch

# The single file model is from https://civitai.com/api/download/models/1413133?type=Model&format=SafeTensor&size=full&fp=fp16
# When torch_dtype is bf16, it works, the RAM usage...
```

I've found an easy way to solve this issue. We can check the dtype before loading the model with `.from_single_file`, and then pass it into `.from_single_file`.

```python
from diffusers.loaders.single_file_utils import...
```
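The snippet above is cut off, so the exact helper imported from `diffusers.loaders.single_file_utils` is not visible. As a minimal sketch of the same idea, one could peek at the checkpoint's tensor dtype with `safetensors` (used here as a stand-in for whatever the truncated import was) and forward it as `torch_dtype`. The checkpoint path below is hypothetical; only `FluxTransformer2DModel` and `.from_single_file` come from the comments above.

```python
# Hedged sketch, not the original comment's code: inspect the single-file
# checkpoint's dtype first, then pass it explicitly to .from_single_file.
from safetensors import safe_open
from diffusers import FluxTransformer2DModel

ckpt_path = "flux_model.safetensors"  # hypothetical local path to the single-file checkpoint

# Read one tensor from the safetensors file to detect the stored dtype,
# without loading the whole state dict into RAM.
with safe_open(ckpt_path, framework="pt") as f:
    first_key = next(iter(f.keys()))
    ckpt_dtype = f.get_tensor(first_key).dtype

# Load with the detected dtype so the loader does not convert the weights
# to a different precision while loading.
transformer = FluxTransformer2DModel.from_single_file(ckpt_path, torch_dtype=ckpt_dtype)
```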

Yes, it's overly strict, and sometimes there is no way to avoid setting state synchronously inside a useEffect.