ControlNet
style transfer not wanted
Hi, I trained ControlNet with depth conditioning on my own Blender dataset and found that the pose is really good, but the artificial style of my Blender images also heavily influences the resulting model (style transfer), which I do not want. The outputs should look realistic instead. Is there any way to improve that? Greetings, Matthes
How did you generate the prompts?
(Since I was too lazy to annotate,) I used the prompt "object in front of background" for every image in the training set (6,000 images with various backgrounds). During inference, I do not mention any of those tokens in the prompt (so as not to tempt the model into transferring the style, I thought). Instead, I just put the actual thing I want to create (like "car") as the prompt. The result is that the pose is perfect, but style transfer is an issue, as mentioned. Do you think proper prompts are the solution? And why?
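For reference, a minimal inference sketch along those lines with diffusers might look like the following; the checkpoint path, depth map file, and prompt are placeholders, not the setup actually used in this thread:

```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline, UniPCMultistepScheduler
from diffusers.utils import load_image

# Load the custom depth ControlNet (path is a placeholder) on top of a standard SD 1.5 base.
controlnet = ControlNetModel.from_pretrained("path/to/my-depth-controlnet", torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.to("cuda")

depth_map = load_image("depth.png")  # the depth conditioning image

# Only the object name goes into the prompt, as described above.
image = pipe(
    "car",
    image=depth_map,
    num_inference_steps=30,
    # Lowering this below 1.0 weakens the ControlNet's influence, which can
    # reduce how much of the training-set style bleeds into the output.
    controlnet_conditioning_scale=1.0,
).images[0]
image.save("car.png")
```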
If the training prompts are better, the ControlNet won't learn to change the style of SD, because the style is already described in the training prompts.
OK, I'll try it, thank you.
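One way to get per-image training prompts without hand annotation is automatic captioning, for example with BLIP; this is only a sketch of that idea (the model checkpoint, folder name, and output file are assumptions, not something used in this thread):

```python
import json
from pathlib import Path

from PIL import Image
from transformers import BlipForConditionalGeneration, BlipProcessor

processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

# Write one caption per training image into a metadata file for the dataset.
with open("metadata.jsonl", "w") as f:
    for path in sorted(Path("train_images").glob("*.png")):
        image = Image.open(path).convert("RGB")
        inputs = processor(images=image, return_tensors="pt")
        out = model.generate(**inputs, max_new_tokens=30)
        caption = processor.decode(out[0], skip_special_tokens=True)
        f.write(json.dumps({"file_name": path.name, "text": caption}) + "\n")
```

Captions like these describe the actual content (and often the rendered look) of each image, so the style information stays in the text conditioning rather than being baked into the ControlNet weights.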
Hi @tensorflowi
Can you please share the inference script you are using?
Thanks in advance.
If the training prompts are better, the ControlNet won't learn to change the style of SD, because the style is already described in the training prompts.
Can I solve this issue by finetuning the text encoder?