No worries, in the end I managed to set up the forked hyperparams branch locally. Learned a lot from it. Smart system.
> A pull request is meant for when you want the changes you made in your own copy of the repository (the fork) to also be made in the main...
Hey @gshawn3, I noticed kohya merged the PR: https://github.com/kohya-ss/sd-scripts/pull/1285#event-12857774925. I tried training in Kohya_ss with sd-scripts set to dev, but didn't see the parameters being passed. Does it work for you?
Just to be sure, are you using the PR Hyperparams branch or the latest dev branch? Ok, I understand: running kohya_ss with the sd-scripts remote set to the dev branch, and running the...
Hi @gshawn3, pardon me for the simple question. - I switched the sd-scripts remote to the dev branch - I ran with --log_config in the CLI with a TOML config: `(venv) E:\kohya_ss>E:\kohya_ss\venv\Scripts\accelerate.EXE launch...`
> Hi, yes, it is working correctly for me. Note that I'm running the training sd-script directly from the command line, where I added the...
@gshawn3 Confirming it works fine with --log_config in the CLI with a TOML file. All sorted. Thanks again.
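For context on what "parameters passing to wandb" means here, a minimal sketch of how hyperparameters end up in a run's config panel, assuming they are simply passed as the `config` dict at init time. The project name and argument values below are hypothetical, and this is not sd-scripts' actual implementation:

```python
import wandb

# Hypothetical stand-ins for the resolved training arguments; a real run would
# use whatever the CLI/TOML config resolves to (illustration only, not sd-scripts code).
training_args = {
    "learning_rate": 1e-4,
    "lr_scheduler_type": "CosineAnnealingLR",
    "lr_scheduler_args": ["T_max=510"],
}

# Everything passed as `config` shows up under the run's hyperparameter/config panel.
# mode="offline" just lets the sketch run without a wandb login.
run = wandb.init(project="lora-training", config=training_args, mode="offline")
run.finish()
```

If some arguments show up in wandb and others don't, it usually comes down to which values the caller includes in that config dict before init.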
@gshawn3 hey, I noticed "Additional Arguments" is not being passed to wandb. It just shows '[]' whether there are arguments or not. Do you know if there is a bug or how to...
@ccharest93 that's interesting. But "Optimiser Args" does pass. It's a pity; it seems so arbitrary. Who could change this? Is this a kohya-ss thing?
@ccharest93 I am also noticing that the annealing arguments I am passing (`--lr_scheduler_type CosineAnnealingLR --lr_scheduler_args T_max=510`) are not actually being passed to the training itself. I wonder if there is a...
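For reference, `--lr_scheduler_type CosineAnnealingLR --lr_scheduler_args T_max=510` should map onto PyTorch's scheduler roughly as in the sketch below. The dummy model and optimizer are placeholders only; this is not sd-scripts' code, just what the flags are expected to be equivalent to:

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

# Dummy model/optimizer only so a scheduler can be constructed; in real
# training these come from the trainer's own setup.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# --lr_scheduler_type CosineAnnealingLR --lr_scheduler_args T_max=510
# is expected to boil down to this constructor call:
scheduler = CosineAnnealingLR(optimizer, T_max=510)

# The learning rate then follows a cosine curve over 510 scheduler steps.
for step in range(3):
    optimizer.step()
    scheduler.step()
    print(step, scheduler.get_last_lr())
```

If the scheduler args were being applied, the logged learning rate would decay along that cosine curve rather than following the default schedule, which is one way to check whether they actually reach the training run.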