
About fine-tuning

Open · GW-S opened this issue 5 years ago · 1 comment

Hello. In your paper, we can see that the BERT embedding uses pre-trained BERT to generate word vectors for the sequence: "In order to facilitate the training and fine-tuning of the BERT model, we transform the given context and target to '[CLS] + context + [SEP]' and '[CLS] + target + [SEP]' respectively." Can you tell me how you fine-tune the BERT model? Do you use the BERT ML model?
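(For reference, the "[CLS] + context + [SEP]" transformation described above can be sketched as follows. This is an illustrative example using the Hugging Face `transformers` tokenizer, not the repo's exact preprocessing code, and the sample context/target strings are made up:)

```python
# Sketch of the "[CLS] + context + [SEP]" and "[CLS] + target + [SEP]"
# input construction described in the paper, using the Hugging Face
# BertTokenizer (an assumption; the repo builds these sequences with
# its own tokenizer utilities).
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
context = "the food was great but the service was slow"
target = "service"

# add_special_tokens=True prepends [CLS] and appends [SEP]
context_ids = tokenizer.encode(context, add_special_tokens=True)
target_ids = tokenizer.encode(target, add_special_tokens=True)

print(tokenizer.convert_ids_to_tokens(context_ids))
# ['[CLS]', 'the', 'food', ..., '[SEP]']
print(tokenizer.convert_ids_to_tokens(target_ids))
# ['[CLS]', 'service', '[SEP]']
```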

— GW-S, Dec 20 '19

The pre-trained BERT module is a submodule of your BERT-based model, so all the parameters are tuned together; fine-tuning happens end-to-end when the downstream model is trained.

https://github.com/songyouwei/ABSA-PyTorch/blob/6ba6e040c8bc7aa9e7294905d3254a3f79e46caf/train.py#L38-L39
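(A minimal sketch of this idea, assuming the Hugging Face `transformers` library rather than the older `pytorch-pretrained-bert` package the repo uses at that commit; the class name `SimpleBertClassifier` and the hyperparameters are hypothetical, not taken from the repo:)

```python
# Sketch: the pre-trained BERT model is held as a submodule, so
# model.parameters() includes BERT's weights and a single optimizer
# fine-tunes them jointly with the new classifier head.
import torch
import torch.nn as nn
from transformers import BertModel

class SimpleBertClassifier(nn.Module):
    def __init__(self, num_classes=3, dropout=0.1):
        super().__init__()
        # Pre-trained BERT as a submodule of the task model
        self.bert = BertModel.from_pretrained('bert-base-uncased')
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_classes)

    def forward(self, input_ids, attention_mask=None):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = outputs.pooler_output  # [CLS]-based sentence representation
        return self.classifier(self.dropout(pooled))

model = SimpleBertClassifier()
# Because BERT is a submodule, its parameters are in model.parameters()
# and get updated by the same optimizer as the classifier head.
optimizer = torch.optim.Adam(model.parameters(), lr=2e-5)
```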

— songyouwei, Dec 25 '19