XQ

Results: 9 comments by XQ

    for sub_t, sub_loss in zip(ts.cpu().numpy(), values.detach().cpu().numpy()):
        quartile = int(4 * sub_t / diffusion.num_timesteps)
        logger.logkv_mean(f"{key}_q{quartile}", sub_loss)

This just means that, over the reverse process that recovers the original image from step 0 to T, the loss is sampled and computed at a few intermediate timesteps.
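As a minimal, self-contained illustration of that bucketing (plain NumPy, with an assumed `num_timesteps = 1000` and made-up loss values; `logkv_mean` is emulated here by averaging per key):

```python
import numpy as np

num_timesteps = 1000  # assumed diffusion length, for illustration only
ts = np.array([3, 250, 499, 750, 999])          # sampled timesteps
losses = np.array([0.9, 0.4, 0.3, 0.1, 0.05])   # made-up per-sample losses

# int(4 * t / T) maps t in [0, T) to a quartile index in {0, 1, 2, 3}
buckets = {}
for t, loss in zip(ts, losses):
    q = int(4 * t / num_timesteps)
    buckets.setdefault(q, []).append(loss)

# logkv_mean effectively reports a running mean for each "{key}_q{quartile}" key
for q, vals in sorted(buckets.items()):
    print(f"loss_q{q}: {np.mean(vals):.3f}")
```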

Did you try using v2? The structure diagram of v2 seems to be multi-class, but I can't seem to find any specific modifications for that in the code...

I made similar adjustments on V1 before, but in the end I found that the loss calculation predicts noise (which can be modified to predict x0), but the predicted...
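For context, a minimal sketch of the two loss targets, assuming the standard DDPM parameterisation x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps (names like `alphas_cumprod` and the `target` flag are illustrative, not the repo's API):

```python
import torch

def simple_diffusion_loss(model, x0, t, alphas_cumprod, target="eps"):
    # Standard forward process: x_t = sqrt(a_bar_t) * x0 + sqrt(1 - a_bar_t) * eps
    a_bar = alphas_cumprod[t].view(-1, 1, 1, 1)
    eps = torch.randn_like(x0)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps

    pred = model(x_t, t)
    if target == "eps":
        # Predict the added noise (the usual epsilon-prediction objective)
        return ((pred - eps) ** 2).mean()
    # Predict the clean image x0 directly
    return ((pred - x0) ** 2).mean()
```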

For separating the channels, I don't know why using `torch.split()` gives a different result from directly slicing with `model_output[:, :0, :, :]`, but I can only specify the classification ratio using `model_output, model_var_values = th.split...`
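For what it's worth, a small check (assuming the usual convention that the mean and variance halves are concatenated along the channel axis) shows that `th.split(model_output, C, dim=1)` and contiguous channel slicing produce identical tensors, whereas a slice like `model_output[:, :0, :, :]` selects zero channels, which could explain the mismatch:

```python
import torch as th

B, C, H, W = 2, 3, 8, 8
model_output = th.randn(B, 2 * C, H, W)  # mean and variance stacked on channels

# th.split cuts the channel axis into chunks of size C
mean_split, var_split = th.split(model_output, C, dim=1)

# Equivalent explicit slicing: first C channels, then the remaining C
mean_slice = model_output[:, :C, :, :]
var_slice = model_output[:, C:, :, :]

assert th.equal(mean_split, mean_slice)
assert th.equal(var_split, var_slice)

# By contrast, [:, :0, :, :] is an empty slice (zero channels)
print(model_output[:, :0, :, :].shape)  # torch.Size([2, 0, 8, 8])
```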

The results are indeed very poor. Binary classification also performs badly on my dataset, with poor fine-grained detail, far below a common U-Net network. It...

> I did not try adding channels for multi-class segmentation, but instead tried to generate multiple pixel values representing different classes in the last channel of the current code. >...

Reportedly, other people need around 100k steps before the results look decent.

Why is the LoRA removed during training? Wouldn't that just be the regular SAM then?