EMNLP2017_DOC
Training with Negative Samples Significantly Harms Performance
I tried mixing in some negative samples when training my classifier under your framework, but the validation accuracy dropped from 94% to 61%.
Should I never do that?
Is it possible to modify the loss function to cope with negative samples?
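For concreteness, here is a minimal sketch of the kind of change I have in mind. It assumes PyTorch and uses a `-1` label to mark negative samples (both are my own assumptions, not part of your framework): positive samples keep the usual cross-entropy, while negatives are pushed toward a uniform distribution over the classes so the model is discouraged from predicting any class confidently for them.

```python
import torch
import torch.nn.functional as F

def mixed_loss(logits, labels, neg_weight=0.5):
    """Hypothetical loss: logits is (batch, num_classes); labels is (batch,),
    with -1 marking a negative sample (an assumption for this sketch)."""
    is_neg = labels == -1
    loss = logits.new_zeros(())
    if (~is_neg).any():
        # Standard cross-entropy on the ordinary (positive) samples.
        loss = loss + F.cross_entropy(logits[~is_neg], labels[~is_neg])
    if is_neg.any():
        # For negatives, match a uniform target distribution, i.e. penalize
        # confident predictions (equals KL(uniform || p) up to a constant).
        log_probs = F.log_softmax(logits[is_neg], dim=-1)
        uniform_ce = -log_probs.mean(dim=-1)  # average over classes
        loss = loss + neg_weight * uniform_ce.mean()
    return loss
```

Would something along these lines make sense with your model, or is there a recommended way to handle negative samples in the framework?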