Aghiles Kebaili
Then why do they put stats comparing ResNets and ViTs?
So you mean that the output of the model should be a softmax and not an embedding? I fixed the output of my ViT to 512 thinking that what...
You say logits are the probabilities of the different classes, but according to this line ``target_logit = logits[index, labels[index].view(-1)]`` in the ArcFace loss function, logits are 2-or-more-dimensional vectors....
OK, so if I understood correctly, logits are class probabilities: let's suppose 1000 classes, so for example logits.shape = (batch_size, 1000)? And labels is the embedding? Why did they call...
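For what it's worth, here is a minimal sketch of what that quoted indexing line does, assuming logits of shape (batch_size, num_classes) and labels holding integer class indices (the `index` tensor is assumed here to be the batch indices being updated, as in typical ArcFace implementations):

```python
import torch

batch_size, num_classes = 4, 1000
logits = torch.randn(batch_size, num_classes)          # one row of class scores per sample
labels = torch.randint(0, num_classes, (batch_size,))  # ground-truth class index per sample

# assumed: `index` selects which rows of the batch to touch (here, all of them)
index = torch.arange(batch_size)

# picks, for each selected sample, the logit of its ground-truth class
target_logit = logits[index, labels[index].view(-1)]
print(target_logit.shape)  # one scalar per selected sample
```

So `logits` is indeed (batch_size, num_classes), and `labels` is not the embedding but the class indices used to pick out each sample's target logit.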
First of all, I want to thank you for the time you're giving to answer me, especially since the timezone is different between us haha. I thought Partial FC (which I...
Thank you, and yes, I agree that it's better to read the paper. But the thing is that I don't have much time haha, I had a deadline so I...
The 0/1 flip is confusing, yes. One simple solution is to just generate random Bernoulli values at probability (1 - drop_prob) instead of drop_prob. The function torch.bernoulli takes the positive...
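The suggestion above can be sketched as a small stochastic-depth (DropPath) helper; this is just an illustration of sampling the keep-mask directly at (1 - drop_prob) rather than sampling at drop_prob and inverting, and the function name and scaling convention are assumptions, not code from the thread:

```python
import torch

def drop_path(x: torch.Tensor, drop_prob: float = 0.0, training: bool = True) -> torch.Tensor:
    # No-op at eval time or when nothing is dropped.
    if drop_prob == 0.0 or not training:
        return x
    keep_prob = 1.0 - drop_prob
    # One Bernoulli draw per sample, sampled directly at keep_prob
    # (1 = keep the residual path, 0 = drop it), broadcast over the rest.
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = torch.bernoulli(torch.full(shape, keep_prob, device=x.device, dtype=x.dtype))
    # Scale kept paths by 1/keep_prob so the expected activation is unchanged.
    return x * mask / keep_prob
```

Sampling at keep_prob avoids the confusing inversion entirely, since torch.bernoulli returns 1 with exactly the probability you pass in.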
Yeah, problem fixed. Actually, I'm training on 1x128x128 BraTS images and I forgot to put a torch.no_grad() during the reverse process. However, I still have an issue with the reverse process....
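For anyone hitting the same memory blow-up: the fix is to run the sampling loop under torch.no_grad(), since the reverse process needs no gradients and otherwise the autograd graph grows at every step. A minimal sketch, with a hypothetical DDPM-style loop and a placeholder update (the real update would use the noise schedule's posterior mean/variance):

```python
import torch

@torch.no_grad()  # sampling needs no gradients; without this, memory grows each step
def reverse_process(model, shape, num_steps: int, device: str = "cpu") -> torch.Tensor:
    # Hypothetical sampling loop: `model(x_t, t)` is assumed to predict the noise.
    x = torch.randn(shape, device=device)
    for t in reversed(range(num_steps)):
        t_batch = torch.full((shape[0],), t, device=device, dtype=torch.long)
        eps = model(x, t_batch)
        # ...real update uses the scheduler's posterior (omitted); placeholder step:
        x = x - 0.01 * eps
    return x
```

The decorator form is equivalent to wrapping the loop body in `with torch.no_grad():`; either way, the returned samples carry no computation graph.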
No, unfortunately not yet!
Clearly, no one wants to answer this question lol