semsim
Is it possible to fine-tune pre-trained weights on a custom dataset rather than fine-tuning the BART checkpoint?
Is it possible to fine-tune the model with the semsim.pt weights on a custom dataset, rather than fine-tuning the BART checkpoint?
Hi Ranjeet, yes, I think it will work well, but we haven't officially tried it yet.
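A minimal sketch of what that first route could look like, assuming semsim.pt is a fairseq-style BART checkpoint and that your custom dataset has already been binarized into `custom_data-bin/` (both paths are placeholders, not from this repo's docs):

```python
# Load the SemSim weights as the starting point instead of bart.large,
# then plug the model into the usual fine-tuning setup.
from fairseq.models.bart import BARTModel

bart = BARTModel.from_pretrained(
    "checkpoints/",                        # directory containing semsim.pt (assumed path)
    checkpoint_file="semsim.pt",           # SemSim weights instead of the plain BART checkpoint
    data_name_or_path="custom_data-bin",   # your preprocessed custom dataset (assumed path)
)
bart.train()  # ready to continue fine-tuning from the SemSim weights
```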
In case it does not work well, I would recommend:
- First, let BART learn the structure of the new dataset by fine-tuning it (with cross-entropy loss) on your dataset for a few epochs.
- Then transfer the checkpoint from the first step and fine-tune it with the SemSim approach for a few more epochs to make the model perform better (see the sketch after this list).
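Here is a rough illustration of that two-step recipe, using Hugging Face transformers rather than this repo's fairseq training code, purely to show the two stages; the dataset fields (`document`, `summary`) and hyperparameters are placeholders, and the SemSim loss itself is not reproduced here:

```python
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

def cross_entropy_step(batch):
    """Step 1: standard fine-tuning on the custom dataset with cross-entropy loss."""
    inputs = tokenizer(batch["document"], return_tensors="pt",
                       truncation=True, padding=True)
    labels = tokenizer(batch["summary"], return_tensors="pt",
                       truncation=True, padding=True).input_ids
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# ...run cross_entropy_step over the custom dataset for a few epochs,
# then save the checkpoint from step 1...
model.save_pretrained("bart-ft-custom")

# Step 2: reload that checkpoint and continue training for a few more epochs
# with the SemSim objective (use the repo's semantic-similarity loss in place
# of plain cross-entropy at this stage).
model = BartForConditionalGeneration.from_pretrained("bart-ft-custom")
```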
I am pretty interested to see how it will work on other tasks or datasets.
Thanks!
@icml-2020-nlp Let me try both approaches and see how they perform. I will post the results here on this issue once done.