
Retro-fitting a pretrained model

Open · dean-sh opened this issue 2 years ago · 7 comments

Hey,

Thank you for your implementation! Is it possible to use your library to "retro-fit" a pretrained model?

I guess that would mean freezing the pretrained model during training and only fine-tuning the new retrieval and cross-attention components? How would you recommend doing that?
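
Roughly what I have in mind, as a toy PyTorch sketch (the module names like `cross_attn` and `retrieval_encoder` are placeholders I made up, not this library's actual API):

```python
import torch
from torch import nn

# Toy stand-in for a retro-fitted network: a frozen "pretrained" block plus
# newly added retrieval-encoder and cross-attention modules (placeholder
# names, not this repo's actual naming).
model = nn.ModuleDict({
    'pretrained_block': nn.Linear(512, 512),
    'retrieval_encoder': nn.Linear(512, 512),
    'cross_attn': nn.MultiheadAttention(512, 8, batch_first = True),
})

# freeze everything except the newly added components
for name, param in model.named_parameters():
    param.requires_grad = any(k in name for k in ('cross_attn', 'retrieval'))

# optimize only the unfrozen (new) parameters
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad),
    lr = 3e-4,
)
```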

Thanks!

dean-sh avatar May 22 '22 15:05 dean-sh

I'm interested in this as well, but I haven't had time to work on it. The original paper "retrofitted" T5 by adding new cross-attentions between the pretrained model and the KB retrieval/chunk system. They claimed it only took a small number of training steps to teach the revised model to use the new cross-attentions. I'm assuming this involved training all of the model weights on a masking task, the same way it was done in the original pretraining.
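
Schematically, I picture it something like this (a generic PyTorch sketch of the idea, not the paper's or this repo's actual code; the class and argument names are made up):

```python
import torch
from torch import nn

class RetrofittedLayer(nn.Module):
    """Sketch: wrap a frozen pretrained decoder block and add a residual
    cross-attention over the encoded retrieved chunks."""

    def __init__(self, pretrained_block: nn.Module, dim: int, heads: int = 8):
        super().__init__()
        self.pretrained_block = pretrained_block
        for p in self.pretrained_block.parameters():
            p.requires_grad = False  # keep the pretrained weights frozen
        self.norm = nn.LayerNorm(dim)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first = True)

    def forward(self, x, retrieved):
        # x:         (batch, seq_len, dim) decoder hidden states
        # retrieved: (batch, num_neighbor_tokens, dim) encoded neighbours
        x = self.pretrained_block(x)
        # residual cross-attention into the retrieved-chunk encodings
        attn_out, _ = self.cross_attn(self.norm(x), retrieved, retrieved)
        return x + attn_out

# usage with toy shapes
layer = RetrofittedLayer(nn.Linear(512, 512), dim = 512)
out = layer(torch.randn(2, 128, 512), torch.randn(2, 256, 512))
```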

It shouldn't be too difficult to hack up the Hugging Face model code to add the cross-attentions and then use the information retrieval components from here. I'll probably try this sometime in the next few months. I'm more interested in the BART model, so I was planning to work on that rather than T5. Let me know if you or someone else gets to it first.
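
If I'm reading this repo's README correctly, the retrieval side can be driven standalone through the `TrainingWrapper`, whose dataloader yields `(seq, retrieved)` pairs that a modified BART's new cross-attentions could consume. Something like the following, though you should double-check the exact arguments against the README:

```python
from retro_pytorch import RETRO, TrainingWrapper

# Per this repo's README: the wrapper chunks the documents, builds the
# faiss index, and serves training pairs of sequences plus neighbours.
retro = RETRO(chunk_size = 64, max_seq_len = 2048)

wrapper = TrainingWrapper(
    retro = retro,
    knn = 2,
    chunk_size = 64,
    documents_path = './text_folder',            # folder of .txt documents
    glob = '**/*.txt',
    chunks_memmap_path = './train.chunks.dat',
    seqs_memmap_path = './train.seq.dat',
    doc_ids_memmap_path = './train.doc_ids.dat',
    max_chunks = 1_000_000,
    max_seqs = 100_000,
)

train_dl = iter(wrapper.get_dataloader(batch_size = 2, shuffle = True))
seq, retrieved = next(train_dl)
# `retrieved` holds the neighbour chunks (with continuations) per input
# chunk; these are what a retrofitted model would cross-attend into.
```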

bjascob avatar Jun 02 '22 01:06 bjascob

Thank you for your implementation!

I'm interested in how you would add CCA to BART: in the encoder or in the decoder? If in the encoder, CCA is causal while the encoder is bidirectional; how would you recommend resolving that? If in the decoder, retrieval needs at least 64 tokens, so if the generated text is shorter than 64 tokens, retrieval would never be used.
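
To make the second point concrete, here is a toy calculation with RETRO's chunk size of 64 (just an illustration of the constraint, not this repo's code):

```python
# A chunk can only attend to neighbours retrieved for *completed* earlier
# chunks, which is what keeps chunked cross-attention causal.
def completed_chunks_before(token_index: int, chunk_size: int = 64) -> int:
    return token_index // chunk_size

for n in (32, 63, 64, 200):
    print(n, '->', completed_chunks_before(n))
# 32 -> 0 and 63 -> 0: generations shorter than 64 tokens never retrieve
# 64 -> 1: the first retrieval only becomes usable at token 64
# 200 -> 3: three completed chunks' neighbours are available
```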

Thanks!

bling0830 avatar Jun 30 '22 02:06 bling0830

Has anyone of you worked on the retrofitting part yet?

saisurbehera avatar Nov 18 '22 20:11 saisurbehera

I haven't had the time, and although I'm still somewhat interested, realistically I probably won't get to this.

It might be worth emailing the authors of the original paper to see if they'd be willing to post that code or provide additional information on the retrofitting process. As I recall, there was only a paragraph or so on it, and there are a number of details it would be helpful for them to provide.

bjascob avatar Nov 18 '22 21:11 bjascob

Yup, let me email them; hopefully they'll respond.

saisurbehera avatar Nov 21 '22 17:11 saisurbehera

Unfortunately, I got no response from them.

saisurbehera avatar Nov 22 '22 00:11 saisurbehera

Hey there, has anyone had time to work on this?

Misterion777 avatar Nov 23 '23 14:11 Misterion777