
Training on GPT-4-LLM

Open TheGermanEngie opened this issue 2 years ago • 0 comments

So I cut down part of the GPT-4-LLM dataset from Microsoft Research to match the file size of tloen/alpaca-lora's JSON file, which works out to about 51% of the total GPT-4-LLM .json size (43.4 MB). Since the file sizes match, I'm assuming it can be used to train lit-llama, and I've formatted it appropriately.
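For reference, here's a minimal sketch of the trimming step. The filenames and the target byte count are my own choices (derived from the 51% / 43.4 MB figures above), not anything lit-llama prescribes, and the instruction/input/output keys are the Alpaca-style layout both datasets use:

```python
import json

# Hypothetical paths: adjust to wherever the datasets were saved.
SOURCE = "alpaca_gpt4_data.json"       # full GPT-4-LLM instruction data
TARGET = "alpaca_gpt4_trimmed.json"    # cut-down file for lit-llama
TARGET_BYTES = int(0.51 * 43.4e6)      # ~51% of 43.4 MB, matching alpaca-lora's file

with open(SOURCE) as f:
    records = json.load(f)

# Keep only well-formed Alpaca-style records, then trim from the tail
# until the serialized file fits under the target size.
records = [r for r in records if {"instruction", "input", "output"} <= r.keys()]
while len(json.dumps(records).encode()) > TARGET_BYTES:
    records = records[: int(len(records) * 0.95)]

with open(TARGET, "w") as f:
    json.dump(records, f)

print(f"kept {len(records)} records, ~{len(json.dumps(records).encode()) / 1e6:.1f} MB")
```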

If I were to use a 48GB card or two 24GB cards, would it be possible to finetune 7B (or 13B) on the full GPT-4-LLM dataset with LoRA?
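For a rough sense of whether that fits, here's a back-of-envelope estimate. The activation and LoRA/optimizer overhead figures are assumptions on my part; actual usage depends on sequence length, micro-batch size, and precision:

```python
# Back-of-envelope VRAM estimate for LoRA finetuning. The overhead numbers
# below are rough assumptions, not measurements.
def lora_vram_gb(n_params_billion: float, bytes_per_param: int = 2,
                 activation_overhead_gb: float = 8.0) -> float:
    """Frozen base weights + a small LoRA/optimizer budget + activations."""
    weights = n_params_billion * bytes_per_param  # e.g. 7B * 2 bytes (bf16) = 14 GB
    lora_and_optimizer = 0.5                      # LoRA params are tiny; Adam states only on them
    return weights + lora_and_optimizer + activation_overhead_gb

for size in (7, 13):
    print(f"{size}B: ~{lora_vram_gb(size):.0f} GB")
```

If those assumptions hold, 7B in bf16 should fit on a single 48GB card with headroom (and sits near the edge of a 24GB card), while 13B would likely need the 48GB card or sharding across both 24GB cards.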

TheGermanEngie · Jul 11 '23 01:07