KEPLER
Transform a fairseq model into a Hugging Face transformers model
I changed the conversion code at line 56 from
roberta_sent_encoder = roberta.model.encoder.sentence_encoder
to
roberta_sent_encoder = roberta.model.decoder.sentence_encoder
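Since different fairseq versions and forks expose the sentence encoder under different attribute names (`encoder` in stock RoBERTa, `decoder` in some forks such as KEPLER's), a small helper can pick whichever path exists instead of hard-coding one. This is a hedged sketch with a hypothetical helper name and a stand-in object, not the actual conversion script:

```python
from types import SimpleNamespace

def get_sentence_encoder(model):
    """Return the sentence encoder regardless of whether the fairseq
    wrapper registers it under `encoder` or `decoder` (hypothetical helper)."""
    wrapper = getattr(model, "encoder", None) or getattr(model, "decoder")
    return wrapper.sentence_encoder

# usage sketch with a stand-in object mimicking KEPLER's layout
dummy = SimpleNamespace(decoder=SimpleNamespace(sentence_encoder="SENT_ENC"))
print(get_sentence_encoder(dummy))  # falls through to the `decoder` path
```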
However, another error occurred:
AttributeError: 'MultiheadAttention' object has no attribute 'k_proj'
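This error usually means the checkpoint's `MultiheadAttention` predates the separate `q_proj`/`k_proj`/`v_proj` layout that the transformers conversion script reads, and instead stores one combined `in_proj_weight` of shape `(3 * embed_dim, embed_dim)`. If that is the case here (an assumption, since it depends on the exact fairseq fork KEPLER was trained with), the three matrices can be recovered by slicing the combined tensor:

```python
import torch

embed_dim = 4  # toy size for illustration; RoBERTa-base uses 768

# Assumed older layout: query, key, and value projections stacked along dim 0.
in_proj_weight = torch.randn(3 * embed_dim, embed_dim)

# Split into the q/k/v matrices that the conversion script expects.
q_w, k_w, v_w = in_proj_weight.chunk(3, dim=0)
```

The same slicing would apply to `in_proj_bias` with shape `(3 * embed_dim,)`.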
I'm not sure whether the conversion code should first be adapted to the KEPLER code, or whether this is the right way to convert at all.
My transformers version is 2.0.0, and my fairseq version is 0.9.0.