Subject-Diffusion

Subject-Diffusion: Open Domain Personalized Text-to-Image Generation without Test-time Fine-tuning

9 Subject-Diffusion issues

Thank you for this very exciting project! I see the script for generating images using pretrained checkpoints, but I don't see the checkpoints. Can you please provide the checkpoints and...

Dear Author, Thank you for your outstanding work. I have noticed that the data_process.py script uses two BLIP models, namely “blip-image-captioning-large” and “blip2-opt-2.7b”. May I ask which one you used?
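For reference, here is a minimal captioning sketch; it assumes the standard Hugging Face model ids Salesforce/blip-image-captioning-large and Salesforce/blip2-opt-2.7b and only illustrates how either model produces a caption. It is not the repo's actual data_process.py.

```python
# Illustrative only: load both BLIP captioners mentioned in data_process.py and
# caption one image with each. Model ids and the input path are assumptions.
from PIL import Image
from transformers import (
    BlipProcessor, BlipForConditionalGeneration,
    Blip2Processor, Blip2ForConditionalGeneration,
)

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image

# BLIP-1 captioner
blip_proc = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-large")
blip = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-large")
ids = blip.generate(**blip_proc(images=image, return_tensors="pt"))
print("BLIP-1:", blip_proc.decode(ids[0], skip_special_tokens=True))

# BLIP-2 captioner (larger model, typically richer captions)
blip2_proc = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
blip2 = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b")
ids2 = blip2.generate(**blip2_proc(images=image, return_tensors="pt"))
print("BLIP-2:", blip2_proc.decode(ids2[0], skip_special_tokens=True))
```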

Hi, thanks for your excellent work here! I am reading the code and am a little confused by the image_embeddings_cls in the training_step. The attention layer actually takes the **image_embeddings**...

Hi, it seems the huggingface-hub/diffusers versions in the yaml need to be updated. I simply changed to huggingface-hub==0.13.2 and it worked. The conflict is caused by: The user requested huggingface-hub==0.11.0 diffusers...
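For anyone hitting the same conflict, a small sketch for checking which versions actually ended up installed after changing the pin; it only inspects package metadata and makes no assumptions about the environment file's contents.

```python
# Illustrative only: print the installed versions of the two packages named in the
# pip conflict message, so the updated pin can be verified.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("huggingface-hub", "diffusers"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg} is not installed")
```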

Thanks for your great work and for sharing your code! When I run the code, the model is on the CPU, not on the GPU. How do I solve this? I manually...
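A minimal sketch, assuming the repo trains with pytorch_lightning >= 1.7 (as the traceback in another issue suggests): the Trainer only uses a GPU if one is requested explicitly. The argument names below are illustrative, not the actual flags in train.py.

```python
# Illustrative only: request a GPU from the Lightning Trainer when CUDA is available.
import torch
import pytorch_lightning as pl

trainer = pl.Trainer(
    accelerator="gpu" if torch.cuda.is_available() else "cpu",
    devices=1,
    max_epochs=1,
)
# trainer.fit(model, datamodule)  # model/datamodule come from the repo's own code
```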

I have the problem below:

File "/home/yons/SH100k/Subject-Diffusion-main/train.py", line 880, in <module>
    trainer.fit(model, datamoule)
File "/home/yons/anaconda3/envs/subject-diffusions/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 696, in fit
    self._call_and_handle_interrupt(
File "/home/yons/anaconda3/envs/subject-diffusions/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 650, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
File "/home/yons/anaconda3/envs/subject-diffusions/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py",...

Are you planning to release the pretrained checkpoints?

Hello, do you have reference code for the DINO-I metric? The similarity I computed between real images is above 0.9, and I am looking forward to your...
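DINO-I is usually computed as the cosine similarity between DINO ViT-S/16 [CLS] embeddings of the real and generated images (the DreamBooth-style protocol). A minimal sketch under that assumption follows; it is not the authors' reference code, and it assumes a transformers version that provides ViTImageProcessor (older versions use ViTFeatureExtractor).

```python
# Illustrative only: DINO-I as cosine similarity of normalized DINO ViT-S/16 [CLS] embeddings.
import torch
from PIL import Image
from transformers import ViTImageProcessor, ViTModel

processor = ViTImageProcessor.from_pretrained("facebook/dino-vits16")
model = ViTModel.from_pretrained("facebook/dino-vits16").eval()

def dino_embedding(path: str) -> torch.Tensor:
    image = Image.open(path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        cls = model(**inputs).last_hidden_state[:, 0]  # [CLS] token embedding
    return torch.nn.functional.normalize(cls, dim=-1)

# Hypothetical file names for the real and generated images.
score = (dino_embedding("real.jpg") * dino_embedding("generated.jpg")).sum().item()
print(f"DINO-I: {score:.4f}")
```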

Amazing work! I wonder when the training data will be released.