xtuner
An efficient, flexible and full-featured toolkit for fine-tuning LLM (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
- [x] This PR addresses a TODO in the `tokenize_ftdp_datasets.py` file. Solution: I used the `lstrip` method to remove all leading newline characters from the `content` string. This way, we...
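As context for the fix described above, `str.lstrip` with an explicit `"\n"` argument strips only leading newline characters, leaving other leading whitespace untouched (a minimal illustration, not the PR's actual code):

```python
# Illustrative only: strip leading newlines from a content string,
# as the PR description says was done in tokenize_ftdp_datasets.py.
content = "\n\n  Hello, world!"

# lstrip("\n") removes only the leading "\n" characters; the two
# leading spaces before "Hello" are preserved.
cleaned = content.lstrip("\n")
print(repr(cleaned))  # '  Hello, world!'
```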
Currently the trained LLaVA model can only be used via the CLI (with no way to supply new images) or evaluated with benchmark tools. How can we deploy it using an API...
Model loading & chat example: `xtuner/model/auto.py`

Training on Alpaca:

```
# Convert the alpaca dataset to OpenAI-format JSON
python xtuner/tools/convert_dataset.py tatsu-lab/alpaca alpaca --save-dir converted_alpaca
xtuner train xtuner/configs/internlm/internlm2_chat_1_8b/example.py
```
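For reference, converting a single `tatsu-lab/alpaca` record (fields `instruction`, `input`, `output`) into an OpenAI-style chat message list might look roughly like this. This is a sketch only; the actual conversion is done by `xtuner/tools/convert_dataset.py`, and the exact message layout it emits is an assumption here:

```python
def alpaca_to_openai(record):
    """Sketch: map one Alpaca record to an OpenAI-format chat sample.

    Alpaca records have `instruction`, an optional `input`, and `output`.
    The real converter in xtuner/tools/convert_dataset.py may differ.
    """
    prompt = record["instruction"]
    if record.get("input"):
        # Append the optional input as extra context for the instruction.
        prompt += "\n" + record["input"]
    return {
        "messages": [
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": record["output"]},
        ]
    }

sample = {
    "instruction": "Translate to French.",
    "input": "Hello",
    "output": "Bonjour",
}
print(alpaca_to_openai(sample))
```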
Mixtral 8x7B (ZeRO3): 110 GB
Mistral 7B (ZeRO3): 15 GB
Multi-node training
Great work! Is there a guide for training across multiple nodes? I couldn't find any related documentation.
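Lacking a dedicated guide, one plausible approach (an assumption, since xtuner builds on the PyTorch distributed launcher) is to run the same `xtuner train` command on every node, passing the usual rendezvous settings via environment variables. All hostnames, ports, and GPU counts below are placeholders:

```shell
# Hypothetical 2-node launch sketch; verify variable names against your
# xtuner version. Run the same command on each node, changing NODE_RANK
# (0 on the master node, 1 on the second node).
NPROC_PER_NODE=8 NNODES=2 NODE_RANK=0 \
ADDR=master-node-ip PORT=29500 \
xtuner train xtuner/configs/internlm/internlm2_chat_1_8b/example.py
```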