LLaMA-Factory
🎉 Support for glm-4v-9b with mllm_plugin
Fixes #4375
📝 Submission Checklist
- [x] Did you read the contributor guideline?
- [x] Done eval tests for this PR (limited).
- [x] Done SFT tests for this PR.
Notes:
- Eval test: some padding issues may be encountered due to `self.training` in `modeling_chatglm.py`.
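
The padding issue above stems from a common PyTorch pattern: a module's `forward` branches on the `self.training` flag, so behavior differs between training and evaluation unless `eval()` is called first. The sketch below is a hypothetical, dependency-free stand-in (`ToyModule` and its padding path are illustrative, not the actual `modeling_chatglm.py` code) showing why the flag matters:

```python
class ToyModule:
    """Hypothetical stand-in for a module whose forward pass branches on
    self.training, mirroring the pattern that causes the eval-time
    padding discrepancy noted above."""

    def __init__(self):
        # torch.nn.Module initializes this flag to True
        self.training = True

    def eval(self):
        # mimic torch.nn.Module.eval(): switch the training flag off
        self.training = False
        return self

    def forward(self, tokens):
        if self.training:
            # training-only path: appends an extra pad token
            return tokens + ["<pad>"]
        # inference path: tokens pass through unchanged
        return tokens


module = ToyModule()
print(module.forward(["a", "b"]))         # training path adds padding
print(module.eval().forward(["a", "b"]))  # eval path leaves input as-is
```

If the evaluation harness never flips the flag, the training-only padding path runs during eval and produces mismatched sequence lengths, which is consistent with the issue observed in the eval tests.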