AngelBottomless
Example result:

```
dataset loss: 0.1425575: 1%|▍ | 811/100000 [06:19
```
Sure, I'll do it within 2 hours
Now it will appear in the CLI as:

```
dataset loss: 0.1511301: 0%| | 210/95901 [01:39
```
resolved merge conflict
```
Mean loss of 47 elements
Training at rate of 1e-05 until step 100000
dataset loss:0.154±(0.026): 0%| | 35/100000 [00:16
```
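A minimal sketch of how a mean±stdev readout like the one above could be assembled; the variable names here are illustrative, not taken from the PR:

```python
import statistics

# Hypothetical sketch: build a "mean±(stdev)" loss string from recent losses.
recent_losses = [0.13, 0.17, 0.15, 0.16]  # e.g. the last N per-step losses
mean = statistics.mean(recent_losses)
# statistics.stdev needs at least two samples, so guard the single-sample case
std = statistics.stdev(recent_losses) if len(recent_losses) > 1 else 0.0
desc = f"dataset loss:{mean:.3f}±({std:.3f})"
print(desc)
```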
But another story: AFAIK `loss.backward()` should overwrite a grad that was set to `None` when it's called, so

```python
weights[0].grad = None
loss.backward()
if weights[0].grad is None:
```

this check would always return...
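A minimal sketch of that behavior, assuming PyTorch: `backward()` writes the gradient into `.grad` of a leaf tensor, so an `is None` check right after it is always false.

```python
import torch

# backward() populates .grad on leaf tensors, so checking
# `grad is None` immediately after backward() is always False.
w = torch.randn(3, requires_grad=True)
loss = (w ** 2).sum()
w.grad = None           # explicitly clear the gradient
loss.backward()         # autograd writes the gradient into w.grad
print(w.grad is None)   # False
```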
This is fixed by https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/3486
This error happens when the `--medvram` flag is on and the webui tries to generate a preview.
Here are some fixes and improvements: [#3486](https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/3486)

1. `statistics.stdev` only works if the iterable contains more than 1 item.
2. Lines 277 and 278 had a problem; they should use `loss_info[key]`...
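For point 1, a quick illustration of the `statistics.stdev` constraint:

```python
import statistics

# stdev requires at least two data points
print(statistics.stdev([0.15, 0.12]))  # works: two items

try:
    statistics.stdev([0.15])
except statistics.StatisticsError:
    print("stdev needs more than 1 item")  # a single item raises
```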
https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/3538/files#diff-d3503031ef91fb35651a650f994dd8c94d405fe8e690c41817b1d095d66b1c69R208 Actually you have duplicate lines now: lines 208 and 215; line 293 should be removed. Other than that, it seems okay.