PandaGPT
Training Stage
Thank you for your excellent work. Since the training stages aren't described, does that mean the 160k data is used to train the LLM with LoRA and the llama projection layer at the same time? If so, should I expect something like `modules_to_save: [llama_proj]` in the LoRA config, or do you simply set the projection layer's `requires_grad` to True? To illustrate what I mean, here is a rough sketch of the two options (assuming a PEFT-style setup; the module name `llama_proj` is just my guess at how the projection layer is named):
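```python
import torch.nn as nn
from peft import LoraConfig

# Option 1: let PEFT keep the projection layer fully trainable (and saved
# alongside the adapter) via modules_to_save in the LoraConfig.
lora_config = LoraConfig(
    r=32,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.1,
    modules_to_save=["llama_proj"],  # hypothetical name of the projection layer
)

# Option 2: apply LoRA only to the LLM and manually flip requires_grad
# on the projection layer so it is optimized together with the adapters.
def make_projection_trainable(projection: nn.Module) -> None:
    for p in projection.parameters():
        p.requires_grad = True
```

Which of these matches what was done for the released checkpoint?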