NewNLPer

3 comments of NewNLPer

I am currently working on this issue. I have previously dealt with incremental pre-training for LLaMA, and the key to the problem is how the data_collator function processes the batches.
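
A minimal sketch of what that processing can look like, assuming a HuggingFace transformers setup (the model name, dataset field, and function name below are illustrative assumptions, not from the issue): for causal-LM incremental pre-training, the labels are just the input IDs, with padding positions masked to -100 so they are ignored by the loss.

```python
# Minimal sketch, assuming a HuggingFace setup; names are illustrative.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
tokenizer.pad_token = tokenizer.eos_token  # llama ships without a pad token

def pretrain_collator(batch):
    """Collate raw text for incremental pre-training: every token is a
    training target, so labels == input_ids except at padding (-100)."""
    enc = tokenizer(
        [ex["text"] for ex in batch],
        truncation=True,
        max_length=2048,
        padding=True,
        return_tensors="pt",
    )
    labels = enc["input_ids"].clone()
    labels[enc["attention_mask"] == 0] = -100  # mask pad positions out of the loss
    enc["labels"] = labels
    return enc
```

If no custom packing is needed, the built-in `DataCollatorForLanguageModeling(tokenizer, mlm=False)` from transformers produces essentially the same labels.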

Yes, I have completed it, but something unexpected happened: the model suffered catastrophic forgetting. Unless it is strictly necessary, I do not recommend doing this.

Yes, I need to reiterate that what I am doing is incremental pre-training. It is unlike fine-tuning: the purpose of fine-tuning is to make LLMs understand instructions, whereas the purpose of incremental...
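
To make the contrast concrete (my own illustration, not code from the thread): an instruction fine-tuning collator typically masks the prompt tokens in the labels so that only the response is learned, whereas the pre-training collator above keeps every token as a target. A hedged sketch, reusing the tokenizer from the earlier snippet:

```python
# Hedged sketch of the fine-tuning side of the contrast; names are illustrative.
import torch

def sft_collator(batch, tokenizer, max_length=2048):
    """Collate (prompt, response) pairs for instruction fine-tuning:
    prompt tokens are masked with -100 so only the response is learned."""
    rows = []
    for ex in batch:
        prompt_ids = tokenizer(ex["prompt"], add_special_tokens=False)["input_ids"]
        resp_ids = tokenizer(ex["response"] + tokenizer.eos_token,
                             add_special_tokens=False)["input_ids"]
        ids = (prompt_ids + resp_ids)[:max_length]
        lab = ([-100] * len(prompt_ids) + resp_ids)[:max_length]
        rows.append((ids, lab))
    width = max(len(ids) for ids, _ in rows)
    pad = tokenizer.pad_token_id
    input_ids = torch.tensor([ids + [pad] * (width - len(ids)) for ids, _ in rows])
    labels = torch.tensor([lab + [-100] * (width - len(lab)) for _, lab in rows])
    # build the mask from real lengths, since pad may equal eos for llama
    attention_mask = torch.tensor(
        [[1] * len(ids) + [0] * (width - len(ids)) for ids, _ in rows]
    )
    return {"input_ids": input_ids, "labels": labels,
            "attention_mask": attention_mask}
```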