stable diffusion adapter pipeline for t2i adapter missing `from_single_file`
Is your feature request related to a problem? Please describe.
I want to test out the T2I adapter with my safetensors model. I used to use diffusers-format models, but none work since the update. Many of the other adapters use from_single_file to load safetensors correctly. I keep bringing this up, but the current design of having multiple ways to load files is terrible, since many adapters don't seem to inherit the loading methods.
Describe the solution you'd like. To be able to load safetensors files from a local directory with the Stable Diffusion adapter pipeline for T2I. from_single_file is missing, and it's annoying that from_pretrained can't do it either.
Describe alternatives you've considered. Automatic1111. I just want to test with a specific model; I don't have time to waste on other models.
Additional context. I keep saying this over and over: make all the loading methods handle both safetensors files and the diffusers folder format.
can you provide a reproducible code example?
I'm not sure why I even need to give an example when I pointed out that from_single_file is missing from the pipeline.
But from modifying a line in the example code here: https://huggingface.co/docs/diffusers/en/training/t2i_adapters
Using diffusers 0.27.2
from diffusers import StableDiffusionXLAdapterPipeline, T2IAdapter, EulerAncestralDiscreteScheduler
from diffusers.utils import load_image
import torch

# modified line below doesn't work; T2IAdapter doesn't have this method, unlike most of the other pipelines/models
adapter = T2IAdapter.from_single_file("path/to/adapter.safetensors", torch_dtype=torch.float16)
pipeline = StableDiffusionXLAdapterPipeline.from_single_file(
    "path/to/model.safetensors", adapter=adapter, torch_dtype=torch.float16
)
pipeline.scheduler = EulerAncestralDiscreteScheduler.from_config(pipeline.scheduler.config)
pipeline.enable_xformers_memory_efficient_attention()
pipeline.enable_model_cpu_offload()

control_image = load_image("./conditioning_image_1.png")
prompt = "pale golden rod circle with old lace background"
generator = torch.manual_seed(0)
image = pipeline(
    prompt, image=control_image, generator=generator
).images[0]
image.save("./output.png")
cc @DN6 here - do we plan to support T2IAdapter?
Yeah makes sense to add single file support for T2IAdapter. @JemiloII Do you have an example hosted checkpoint we could use to test? And would you be interested in adding the functionality to T2IAdapter?
I don't think there's any T2IAdapter checkpoint that is not in the diffusers format. What the OP is probably doing is using one redistributed by the UIs, which take the diffusers weights but hardcode the config, so they only distribute the safetensors file.
For example, the T2I-Adapter for Lineart:
https://civitai.com/models/136070?modelVersionId=155414
which downloads a controlnetxlCNXL_tencentarcLineart.safetensors file, but the description states it is the one released here: https://huggingface.co/TencentARC/t2i-adapter-lineart-sdxl-1.0
Not all the ones I've found even have a link back to huggingface.co; some just have the safetensors file and that's it. I'm not sure why each class in the diffusers library has its own loading methods instead of inheriting them. That way, if someone makes a new adapter, or even for an older adapter, it could use the newer load methods. Personally, I'd still rather have a single load function that didn't care about which format it was given and internally checked the format to call the right loader.
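A format-agnostic loader like the one described above could be sketched as a small dispatcher. To be clear, load_any is a hypothetical helper, not part of the diffusers API: it just inspects the path and forwards to whichever existing classmethod applies.

```python
import os

# Hypothetical helper, NOT part of diffusers: dispatch to the right
# existing loader based on what the path looks like on disk.
SINGLE_FILE_SUFFIXES = (".safetensors", ".ckpt", ".pth", ".bin")

def load_any(cls, path, **kwargs):
    """Load `cls` from either a single checkpoint file or a
    diffusers-style folder/repo id, picking the loader automatically."""
    if os.path.isfile(path) or path.endswith(SINGLE_FILE_SUFFIXES):
        # Single checkpoint file -> from_single_file (where the class supports it)
        return cls.from_single_file(path, **kwargs)
    # Folder or hub repo id -> the standard diffusers loader
    return cls.from_pretrained(path, **kwargs)
```

With something like this, load_any(T2IAdapter, path) would accept either layout, provided the class implements both classmethods — which is exactly why T2IAdapter also needs from_single_file.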
Single-file T2I adapters are in the official repo: https://huggingface.co/TencentARC/T2I-Adapter/tree/main/models
And some (but not all) are available in the diffusers folder format. For example, https://huggingface.co/TencentARC/t2iadapter_canny_sd15v2/tree/main is equivalent to https://huggingface.co/TencentARC/T2I-Adapter/blob/main/models/t2iadapter_canny_sd15v2.pth
There are definitely more single-file adapters than diffusers-folder ones, so this would definitely be welcome.
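Since the comments above suggest the redistributed single files are often just the diffusers weights re-saved without their config, converting between the two layouts is mostly state-dict bookkeeping: the weights carry over, and any mismatched key prefixes get renamed. A purely illustrative sketch — the prefixes used here are hypothetical, not real T2I-Adapter key names:

```python
def remap_keys(state_dict, old_prefix, new_prefix):
    """Rename every key that starts with old_prefix to use new_prefix,
    leaving all other keys (and all values) untouched."""
    remapped = {}
    for key, value in state_dict.items():
        if key.startswith(old_prefix):
            key = new_prefix + key[len(old_prefix):]
        remapped[key] = value
    return remapped

# Example with a toy state dict (hypothetical key names):
# remap_keys({"body.0.weight": w}, "body.", "adapter.body.")
# -> {"adapter.body.0.weight": w}
```

If the file was exported straight from diffusers, no remapping is needed at all and "conversion" reduces to writing the matching config.json next to the weights.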
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Ping so the bot does not close this as stale; this is still a very valid request.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
ping