MPL-pytorch
Incorrect cross entropy?
https://github.com/kekmodel/MPL-pytorch/blob/7fb5b40cd53179bf4c09ef0f916815c3272d3e9d/main.py#L197
At this point, `hard_pseudo_label` is a batch-size-by-1 array of integer class indices, not a one-hot encoding.
Is this the correct input when calculating cross entropy?
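For what it's worth, PyTorch's `F.cross_entropy` (and `nn.CrossEntropyLoss`) expects integer class indices as targets rather than a one-hot encoding, so index-shaped pseudo-labels may be intentional here. A minimal sketch (with made-up logits and targets, not values from this repo) comparing the index-target form against a manual one-hot computation:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

num_classes = 10
logits = torch.randn(4, num_classes)           # batch of 4 unnormalized scores
targets = torch.tensor([3, 7, 0, 9])           # integer class indices, NOT one-hot

# F.cross_entropy takes class indices directly
loss = F.cross_entropy(logits, targets)

# Equivalent computation written out with an explicit one-hot encoding:
# -sum(one_hot * log_softmax(logits)) averaged over the batch
one_hot = F.one_hot(targets, num_classes=num_classes).float()
manual = -(one_hot * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()

print(torch.allclose(loss, manual))  # the two forms agree
```

So a `(batch,)` or `(batch, 1)`-squeezed tensor of indices is the expected target format; passing a one-hot tensor to `F.cross_entropy` with index-style semantics would instead be an error (or, in newer PyTorch versions, be interpreted as class probabilities).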