nicolai256

Results: 57 comments of nicolai256

> Nice results! I've been running some experiments as well. I've upped the class images from the suggested 8 to about 100. It seems to generalize better. > > I...

I think this has to do with the inference config file you're using; try the one from the inpaint repo?

try a different learning rate (03) and maybe a slightly higher num_vectors :)
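For reference, those two knobs usually live in the finetune .yaml; a sketch of the relevant section (key names follow the textual-inversion style configs and may differ in your fork):

```yaml
model:
  base_learning_rate: 5.0e-03      # the learning rate; adjust per the advice above
  params:
    personalization_config:
      target: ldm.modules.embedding_manager.EmbeddingManager
      params:
        num_vectors_per_token: 2   # raising this gives the embedding a bit more capacity
```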

> will do. what do the different embeddings files/numbers refer to? is it the different stages of training/steps?

yeah

they're the masks for the outpainting of the images

I'm training on 24 GB VRAM, so 30 GB VRAM is not a requirement. Atm it doesn't work on 8 GB VRAM; maybe someone who has knowledge of optimisation will change the script...

> File "E:\ModelTraining\ldm\modules\attention.py", line 180, in forward sim = einsum('b i d, b j d -> b i j', q, k) * self.scale RuntimeError: CUDA out of memory. Tried to...

No, --init_word overwrites the .yaml setting

When using 2 words you set them in the .yaml file as ["word1","word2"] and don't use --init_word
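So for two initializer words the config would look roughly like this (a sketch; section and key names as in the textual-inversion style .yaml), with no --init_word passed on the command line:

```yaml
personalization_config:
  params:
    placeholder_strings: ["*"]
    initializer_words: ["word1", "word2"]   # used only when --init_word is not given
```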

You don't have to train for days btw