T2I-Adapter
Hello, thank you for this great repo. I have a question about the style buttons in your demo (cinematic, 3D model, photographic, ...): how can we use them?
cv2.imread returns None when given a wrong path.
Hello. Thank you so much for this amazing new tech. In this recently recorded video I show how to install it from scratch and use T2I-Adapter style transfer, among other features...
This is my code:

```python
from diffusers import StableDiffusionXLAdapterPipeline, T2IAdapter, EulerAncestralDiscreteScheduler, AutoencoderKL, MultiAdapter
from diffusers.utils import load_image, make_image_grid
from controlnet_aux.midas import MidasDetector
from controlnet_aux.canny import CannyDetector
import torch

device = 'cuda:0'

# load adapter
depth_midas_adapter = T2IAdapter.from_pretrained(...
```
Hi, really nice work. The results look pretty good. I was looking to train a custom color theme adapter and was wondering if there is any plan to update a...
This is the most important ControlNet in my pipeline, so I'm surprised to see that no T2I-Adapter has been trained with softedge, for either SD 1.5 or SDXL. Any particular reason...
I encountered a "torch.nn.parallel.DistributedDataParallel hangs" problem when I ran train_depth.py. I found that the program never reaches the `dist._verify_model_across_ranks` statement. How can I solve this problem?
I found that LAION-Aesthetics has no style labels. How did you collect the style references?