
Implement with Stable Diffusion repositories

Open · ExponentialML opened this issue 2 years ago · 4 comments

Hello, thanks for the implementation! It works very well.

As a suggestion, it may be helpful to provide code that works with the broader GitHub community. While the Diffusers library offers ease of use and a streamlined experience, its abstractions can make it harder to port this technique to similar implementations built on other codebases.

ExponentialML avatar Sep 10 '22 23:09 ExponentialML

I believe the easiest way to get started is to download the diffusers library and run this Jupyter notebook. While it requires diffusers, it does not use any of the pre-implemented pipelines: diffusers is only used to download and load the neural networks and schedulers. All of the diffusion logic is in the Jupyter notebook itself.
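To make concrete what "diffusion logic in the notebook" means, here is a toy sketch of one deterministic (eta=0) DDIM-style update, the kind of step a notebook would implement itself rather than delegate to a pipeline. The function names and the alpha-bar values below are illustrative, not taken from the notebook:

```python
import math

def ddim_pred_x0(x_t, eps, alpha_bar_t):
    """Recover the clean-image estimate implied by x_t and predicted noise eps."""
    return (x_t - math.sqrt(1.0 - alpha_bar_t) * eps) / math.sqrt(alpha_bar_t)

def ddim_step(x_t, eps, alpha_bar_t, alpha_bar_prev):
    """One eta=0 DDIM update: re-noise the x0 estimate at the previous timestep."""
    pred_x0 = ddim_pred_x0(x_t, eps, alpha_bar_t)
    return math.sqrt(alpha_bar_prev) * pred_x0 + math.sqrt(1.0 - alpha_bar_prev) * eps

# Sanity check with made-up numbers: build x_t from x0 = 1.0 and eps = 0.5,
# then confirm the step inverts it exactly when alpha_bar_prev = 1.0.
alpha_bar_t = 0.8
x_t = math.sqrt(alpha_bar_t) * 1.0 + math.sqrt(1.0 - alpha_bar_t) * 0.5
```

In a real sampler, eps would come from the U-Net and the alpha-bar schedule from the scheduler object; the point is that the update itself is plain arithmetic you can keep in your own code.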

If you are using another library, simply replace the unet, vae and scheduler variables with their counterparts from that library. I might be wrong, but I think the cross-attention modules should be implemented similarly; you may just have to adjust the code injection slightly, as the variable names might differ.
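The injection idea can be sketched as walking the model's module tree and swapping the forward of every cross-attention block. Everything here is a stand-in: real repositories use torch modules with names like CrossAttention, and matching on the class name or module path is where you would adapt per codebase:

```python
class ToyCrossAttention:
    """Stand-in for a cross-attention block (softmax(QK^T / sqrt(d)) V)."""
    def forward(self, x):
        return x

class ToyUNet:
    """Stand-in for a U-Net exposing named submodules, like torch's named_modules()."""
    def __init__(self):
        self.blocks = {
            "down.0.attn": ToyCrossAttention(),
            "mid.attn": ToyCrossAttention(),
        }
    def named_modules(self):
        return self.blocks.items()

def inject_attention_control(model, make_forward):
    """Replace forward on every cross-attention module found in the tree."""
    patched = []
    for name, module in model.named_modules():
        # In a real repo you might match on type(module).__name__ or a
        # name substring instead, since class names differ between forks.
        if isinstance(module, ToyCrossAttention):
            module.forward = make_forward(module)
            patched.append(name)
    return patched

unet = ToyUNet()
patched = inject_attention_control(unet, lambda m: (lambda x: x * 2))
```

The predicate inside the loop is the only part that should need changing when porting between Stable Diffusion forks; the traversal-and-replace pattern stays the same.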

bloc97 avatar Sep 11 '22 00:09 bloc97

> Hello, thanks for the implementation! It works very well.
>
> As a suggestion, it may be helpful to provide code that works with the broader Github community. While the Diffusers library does allow for better ease of use and a more streamlined experience, it can possibly hinder the freedom to use across similar implementations due to how their library works.

I reproduced cross-attention control on top of the Stable Diffusion repository, referring to this repository. However, I'm not sure I reproduced the code properly. See my repository: https://github.com/sunwoo76/CrossAttentionControl-stablediffusion

@bloc97, your repository and comments were good guidance to me. Could you check my implementation too? Thanks :)

sunwoo76 avatar Sep 15 '22 15:09 sunwoo76

@sunwoo76 Awesome! I didn't look very deeply into the code, but it looks good at first glance from reading the README. I think the visualization of the attention maps is a very good feature!

bloc97 avatar Sep 15 '22 16:09 bloc97

> Hello, thanks for the implementation! It works very well. As a suggestion, it may be helpful to provide code that works with the broader Github community. While the Diffusers library does allow for better ease of use and a more streamlined experience, it can possibly hinder the freedom to use across similar implementations due to how their library works.
>
> I reproduced cross attention control based on the stable diffusion repository referring to this repository. However, I'm not sure the codes reproduced properly. Refer to my repository: https://github.com/sunwoo76/CrossAttentionControl-stablediffusion
>
> @bloc97, your repository, and comments were good guidance to me. Could you check my implementation too? Thanks :)

Amazing work!

ExponentialML avatar Sep 15 '22 17:09 ExponentialML