AngelBottomless
https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/3538/files#diff-d3503031ef91fb35651a650f994dd8c94d405fe8e690c41817b1d095d66b1c69R214 it's still remaining, hmm
Circular dependencies are not good. Yes, in this case we have common supermodules, so you can extract those functions and put them into `textual_inversion.py`.
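A minimal sketch of that extraction, with placeholder bodies for the two helpers mentioned below (the real implementations are in the PR; only the import direction matters here):

```python
# textual_inversion.py -- shared home for the helpers, so hypernetwork.py
# and textual-inversion code can both import from here without importing
# each other (which is what created the circular dependency).

def statistics(data):
    # placeholder body: summarize recent loss values
    mean = sum(data) / len(data)
    return {"mean": mean, "count": len(data)}

def report_statistics(data):
    # placeholder body: format the summary for logging
    stats = statistics(data)
    return f"loss mean: {stats['mean']:.4f} over {stats['count']} steps"

# hypernetwork.py would then do:
# from modules.textual_inversion.textual_inversion import statistics, report_statistics
```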
I think you're now missing the `statistics` and `report_statistics` imports on the hypernetwork side. Can you confirm with a screenshot that both are working?
It's working well, I'm getting proper results with 2500-epoch tests. And yes, the last-layer activation, especially ReLU, was blocking it - but sometimes it worked, which means it's doing...
Sorry for the inconvenience, I totally forgot about it.
Okay, I checked the code; apparently `"linear": torch.nn.Identity,` would add an additional layer instead of leaving it empty, which will break old HNs. If Linear was created with that function,...
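To illustrate the mismatch, a torch-free sketch of the structural difference (function and layer names here are illustrative, not the actual webui code): old hypernetworks skipped the activation entirely for "linear", so unconditionally appending a layer changes the module layout they are loaded against.

```python
# Sketch: the module list an old hypernetwork was saved with must match
# the one the loader rebuilds. Appending an Identity for "linear" instead
# of leaving the slot empty changes the layer structure.

def build_layer_names(activation_func, skip_linear_activation):
    layers = ["Linear"]
    # old creation path: skip the activation layer entirely for "linear"
    if activation_func != "linear" or not skip_linear_activation:
        layers.append("Identity" if activation_func == "linear" else activation_func)
    return layers

old = build_layer_names("linear", skip_linear_activation=True)   # ["Linear"]
new = build_layer_names("linear", skip_linear_activation=False)  # ["Linear", "Identity"]
# old != new -> structure mismatch when loading old HNs
```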
In https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/3771, the commit https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/3771/commits/f361e804ebaa5af4a10711ece2522869fb64a4c6 fixes it. @nekoyama32767 @benkyoujouzu if a KeyError is happening, it's because the HN was created without skipping linear, somehow without this line `if activation_func == "linear" or...`
Does your image processing cycle exactly match the dataset length? Are you also using fixed-seed generation and settings? I agree that we need an independent RNG for shuffling, since it's being fixed...
Can you use a new Random object from Python's random module, with fixed seed generation, then permute from it? We'll need to make reproducible results, i.e. fix the seeds used for the result to...
simply

```python
from random import Random, randrange

# in __init__
self.seed = randrange(1
```
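Expanding that idea into a runnable sketch (the `DatasetShuffler` class name and the `1 << 32` seed bound are assumptions for illustration, not the webui code): a per-dataset `Random` instance keeps shuffling independent of the global random state, and a fixed seed makes the permutation reproducible.

```python
from random import Random, randrange

class DatasetShuffler:
    def __init__(self, length, seed=None):
        # independent RNG so dataset shuffling neither disturbs nor is
        # disturbed by the global random module; a fixed seed reproduces runs
        self.seed = seed if seed is not None else randrange(1 << 32)
        self.rng = Random(self.seed)
        self.length = length

    def shuffled_indexes(self):
        # a fresh permutation of the dataset indices on each call
        indexes = list(range(self.length))
        self.rng.shuffle(indexes)
        return indexes

# same seed -> same permutation, regardless of the global random state
a = DatasetShuffler(10, seed=42).shuffled_indexes()
b = DatasetShuffler(10, seed=42).shuffled_indexes()
assert a == b
```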