Dhruv Nair
@ddpasa It's possible to load the Flux transformer individually from the file.

```python
import torch
from diffusers import FluxTransformer2DModel

model = FluxTransformer2DModel.from_single_file(
    "./flux1-dev-fp8.safetensors", torch_dtype=torch.float8_e4m3fn
)
```

Just FYI, diffusers doesn't handle automatic casting...
I think the fine-tune would be just for the transformer (fine-tuning the T5 model is not very common). So I think you should be able to safely load just...
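For reference, a minimal sketch of what that could look like: load only the fine-tuned transformer from the single-file checkpoint and pass it into the pipeline, reusing the other components from the base repo. The checkpoint path here is just a placeholder.

```python
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel

# Load only the fine-tuned transformer weights from the single-file checkpoint
# (placeholder path; replace with your fine-tuned file)
transformer = FluxTransformer2DModel.from_single_file(
    "./flux1-dev-finetune.safetensors", torch_dtype=torch.bfloat16
)

# Reuse the remaining components (T5/CLIP text encoders, VAE, scheduler) from the base repo
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
```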
Hi @ddpasa, yes, we would auto-download the model config when using single file loading. Additionally, what version of `transformers` are you using to load the model?
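In case it helps, a quick sketch of what that looks like: by default `from_single_file` fetches the matching config from the Hub, and you can also point it at a config explicitly. The `config`/`subfolder` arguments and the repo id below are illustrative and assume a recent diffusers release.

```python
import torch
from diffusers import FluxTransformer2DModel

# Default: the matching model config is auto-downloaded from the Hub
model = FluxTransformer2DModel.from_single_file(
    "./flux1-dev-fp8.safetensors", torch_dtype=torch.float8_e4m3fn
)

# Alternatively, point it at a config explicitly (e.g. if auto-detection fails)
model = FluxTransformer2DModel.from_single_file(
    "./flux1-dev-fp8.safetensors",
    config="black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    torch_dtype=torch.float8_e4m3fn,
)

# To check the installed transformers version:
# python -c "import transformers; print(transformers.__version__)"
```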
Sorry, missed this. Yes, this can be merged. The failing tests are unrelated.
Oh wait, was this already fixed here? https://github.com/huggingface/diffusers/blob/a4c1aac3ae10172f4acb8eaf83aac7f1f6e02ab0/tests/models/test_attention_processor.py#L88 (PR: https://github.com/huggingface/diffusers/pull/10359)
Hi @samadwar, do you have a single-file version of Wan Animate we can use to test this PR?