
How to incorporate loss function into my model?

Open · 202041600047 opened this issue on Mar 17, 2022 · 3 comments

Thank you very much for your work. I have a question: my model is BiSeNetV2, and I want to replace my loss function with your pixel contrast loss. However, my model's forward pass returns only a single tensor, while the `preds` argument of the pixel contrast loss is a dictionary with the keys 'seg' and 'embed'. I don't know what 'seg' and 'embed' mean, or how to produce these two values from BiSeNetV2.

```python
class ContrastCELoss(nn.Module, ABC):
    def forward(self, preds, target, with_embed=False):
        h, w = target.size(1), target.size(2)

        assert "seg" in preds
        assert "embed" in preds

        seg = preds['seg']
        embedding = preds['embed']
```

202041600047 commented on Mar 17, 2022

@202041600047 Thanks for your interest. `seg` is the segmentation output of the model, while `embed` is obtained by attaching a convolutional projection head to your backbone. Sorry, I am not very familiar with the structure of BiSeNetV2, so I cannot tell you the specific layer to which you should add the projection head.

tfzhou commented on Mar 18, 2022
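To make the reply above concrete, here is a minimal sketch of such a convolutional projection head. The class name `ProjectionHead`, the layer sizes, and `embed_dim=256` are illustrative assumptions, not code from this repository:

```python
import torch
import torch.nn as nn

class ProjectionHead(nn.Module):
    """Hypothetical head: maps backbone features to a normalized
    per-pixel embedding for a pixel contrast loss."""
    def __init__(self, in_channels, embed_dim=256):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, embed_dim, kernel_size=1),
        )

    def forward(self, feats):
        # L2-normalize along the channel dim, as is common for contrastive losses
        return nn.functional.normalize(self.proj(feats), p=2, dim=1)

feats = torch.randn(2, 128, 32, 32)   # fake backbone features
head = ProjectionHead(in_channels=128)
embed = head(feats)
print(embed.shape)  # torch.Size([2, 256, 32, 32])
```

The 1x1 convolutions keep the spatial resolution of the feature map, so each pixel gets its own embedding vector.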

> @202041600047 Thanks for your interest. `seg` is the segmentation output of the model, while `embed` is obtained by attaching a convolutional projection head to your backbone. Sorry, I am not very familiar with the structure of BiSeNetV2, so I cannot tell you the specific layer to which you should add the projection head.

Thank you very much for taking time out of your busy schedule to answer my question. I have another question: how do I get the embedding? The output of my model is a single tensor. Can I feed that output directly into `self.proj_head()`? That is, can I replace

```python
def forward(self, x_, with_embed=False, is_eval=False):
    x = self.backbone(x_)
    embedding = self.proj_head(x[-1])
```

with

```python
def forward(self, x_, with_embed=False, is_eval=False):
    x = self.backbone(x_)  # x is a single tensor; self.backbone is my own BiSeNetV2
    embedding = self.proj_head(x)
```

In other words, I am not sure what input the projection head should receive.

202041600047 commented on Mar 28, 2022
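One way to handle a single-output backbone, sketched here, is to wrap it so the forward pass returns the `{'seg', 'embed'}` dict that `ContrastCELoss` asserts on. This assumes the backbone returns a feature map rather than final logits; `ContrastiveWrapper`, both heads, and the one-conv stand-in backbone are hypothetical, not code from this repository or from BiSeNetV2:

```python
import torch
import torch.nn as nn

class ContrastiveWrapper(nn.Module):
    """Hypothetical wrapper producing the preds dict the loss expects."""
    def __init__(self, backbone, feat_channels, num_classes, embed_dim=256):
        super().__init__()
        self.backbone = backbone              # trunk returning one feature tensor
        self.seg_head = nn.Conv2d(feat_channels, num_classes, kernel_size=1)
        self.proj_head = nn.Sequential(       # illustrative projection head
            nn.Conv2d(feat_channels, feat_channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_channels, embed_dim, kernel_size=1),
        )

    def forward(self, x):
        feats = self.backbone(x)              # single tensor, as in the question
        return {
            'seg': self.seg_head(feats),
            'embed': nn.functional.normalize(self.proj_head(feats), p=2, dim=1),
        }

backbone = nn.Conv2d(3, 64, kernel_size=3, padding=1)  # stand-in for a real trunk
model = ContrastiveWrapper(backbone, feat_channels=64, num_classes=19)
preds = model(torch.randn(2, 3, 64, 64))
print(sorted(preds.keys()))  # ['embed', 'seg']
```

With this wrapper, `preds` can be passed straight to the loss; both heads read from the same feature map, so feeding the single backbone output into the projection head (as asked above) is exactly what happens here.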

Have you solved the problem of loading the new model? I have the same problem.

heifanfanfan commented on Mar 24, 2023