Xander Steenbrugge

21 comments by Xander Steenbrugge

Ok, so the fix is to change two lines in training/dataset.py. Lines 112/113: switch the comment to:
```
#tfr_shapes.append(parse_tfrecord_np(record).shape)
tfr_shapes.append(parse_tfrecord_np_raw(record))
```
Lines 166/167: switch the comment to:
```
#dset =...
```
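For context, a minimal sketch of what a raw-shape parser like `parse_tfrecord_np_raw` could look like, assuming the TFRecord layout used by the original StyleGAN `dataset.py` (a 'shape' int64 field next to the raw image bytes); the actual function in the repo may differ:

```python
import tensorflow as tf

def parse_tfrecord_np_raw(record):
    # Hypothetical sketch: read only the stored 'shape' field of the
    # tf.train.Example, skipping the expensive decode of the image bytes.
    ex = tf.train.Example()
    ex.ParseFromString(record)
    return list(ex.features.feature['shape'].int64_list.value)
```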

I've updated my bugfix; I accidentally posted the wrong change line here, sorry about that! The current version of the fix should work.

It's been a while since I coded this up, so some things are very likely a bit outdated by now. I might actually remove the version numbers from the requirements file...

Are there any best practices for training with de-distilled models? Because this is all one big jungle, tbh. It would be awesome to have a collection of best practices / experiments...

Thank you Sanaria! I can now confirm that adding masks significantly improves results by maintaining better promptability. I'm currently using very basic prompt-based CLIPseg masks from my SDXL trainer which...
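For reference, a minimal sketch of how prompt-based CLIPSeg masks can be generated with the Hugging Face transformers implementation; the checkpoint, threshold, and how the mask is consumed by the trainer are assumptions here, not a description of the actual SDXL trainer code:

```python
import torch
from PIL import Image
from transformers import CLIPSegProcessor, CLIPSegForImageSegmentation

# Assumed checkpoint; other CLIPSeg variants should work the same way.
processor = CLIPSegProcessor.from_pretrained("CIDAS/clipseg-rd64-refined")
model = CLIPSegForImageSegmentation.from_pretrained("CIDAS/clipseg-rd64-refined")

def prompt_mask(image: Image.Image, prompt: str, threshold: float = 0.3) -> torch.Tensor:
    # Run CLIPSeg on a single (image, prompt) pair and return a binary mask
    # over the model's fixed 352x352 output grid.
    inputs = processor(text=[prompt], images=[image], return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.sigmoid(logits.squeeze())
    return (probs > threshold).float()
```

The mask would then still need to be resized to the training resolution (or the latent resolution) before being applied to the loss.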

After a lot more testing, I'll share some of my current results / opinions:
- getting better results training on top of flux-dev-de-distill-diffusers vs Fluxdev2pro (inference works really well with...

> Thanks for your insights.
>
> I'm surprised you didn't find substantial inference improvements when training with CFG > 1.

I will keep on training with de-distilled CFG =...
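For clarity, a small sketch of what guidance with scale > 1 means at inference time; this is just the standard classifier-free guidance combination, not a description of any specific Flux training or sampling setup:

```python
import torch

def cfg_combine(noise_uncond: torch.Tensor,
                noise_cond: torch.Tensor,
                guidance_scale: float) -> torch.Tensor:
    # Standard classifier-free guidance: extrapolate from the unconditional
    # prediction towards the conditional one. guidance_scale == 1.0 reduces
    # to the plain conditional prediction.
    return noise_uncond + guidance_scale * (noise_cond - noise_uncond)
```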

Hitting the same bug in my ComfyUI deployments. Removing the GUI dependency would be awesome if possible.

> I have just implemented `FLUX.1 dev` Textual Inversion within 20G VRAM. After completing training and testing, I will open-source the code, which may be helpful.

Awesome, is this doing...
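As background, a minimal sketch of the core Textual Inversion idea: register one new token and train only its embedding row while everything else stays frozen. The tokenizer/encoder names below are generic assumptions for illustration (Flux actually uses both a CLIP and a T5 encoder), not the linked implementation:

```python
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

# 1. Register a placeholder token for the new concept.
placeholder = "<my-concept>"
tokenizer.add_tokens([placeholder])
text_encoder.resize_token_embeddings(len(tokenizer))
new_id = tokenizer.convert_tokens_to_ids(placeholder)

# 2. Freeze the encoder, then make only the embedding table trainable.
text_encoder.requires_grad_(False)
embeddings = text_encoder.get_input_embeddings()
embeddings.weight.requires_grad_(True)

def zero_other_grads():
    # Call after backward(): keep gradients only for the new token's row,
    # so the optimizer step updates nothing else.
    grad = embeddings.weight.grad
    mask = torch.zeros_like(grad)
    mask[new_id] = 1.0
    grad.mul_(mask)
```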

@Littleor I've tested your TI training repo but haven't had any success (it won't learn my concept at all). Is it possible there are bugs left in the implementation, or...