embedding-propagation
Question about label propagation during finetuning
In https://github.com/ServiceNow/embedding-propagation/blob/master/src/models/finetuning.py#L103, why is the query matrix multiplied by nclasses?
Shouldn't it start from a zeros matrix, since no labels are available for the queries and labels are propagated from their neighbors?
Hi @Saurabh7,
This is just to flag these examples as "unlabeled". As you can see here: https://github.com/ServiceNow/embedding-propagation/blob/c51e7ac591459052b9c56b1fe1c8d450b3d90b3d/embedding_propagation/embedding_propagation.py#L67, they end up being skipped: the one-hot vectors are built over nclasses + 1 dimensions, and the extra "nclasses" dimension is then removed, so the query rows become all zeros (0...01 -> 0...0), which is exactly the zero vector you expected.
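For intuition, here is a minimal PyTorch sketch of that trick. The tensor shapes and label values are illustrative, not the repo's actual variables:

```python
import torch
import torch.nn.functional as F

# Toy setup: 5 classes, 3 labeled support examples, 4 unlabeled queries.
nclasses = 5
support_labels = torch.tensor([0, 2, 4])                    # labeled support examples
query_labels = nclasses * torch.ones(4, dtype=torch.long)   # queries flagged as "unlabeled"

labels = torch.cat([support_labels, query_labels])

# One-hot over nclasses + 1 columns, then drop the last column.
# Support rows keep their one-hot encoding; query rows become all zeros.
one_hot = F.one_hot(labels, nclasses + 1).float()[:, :nclasses]

print(one_hot)
# Support rows: standard one-hot vectors.
# Query rows:   [0., 0., 0., 0., 0.]   (0...01 -> 0...0 after the slice)
```

So assigning the label value nclasses to the queries is just a compact way of producing the all-zero rows that label propagation expects for unlabeled points.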