Llama3.1-Finetuning

Full-parameter fine-tuning, LoRA fine-tuning, and QLoRA fine-tuning of Llama 3.

3 Llama3.1-Finetuning issues:

Hello, I ran into this problem while trying to reproduce the QLoRA fine-tuning. The error message is:

```
Traceback (most recent call last):
  File "../finetune_llama3.py", line 452, in <module>
    train()
  File "../finetune_llama3.py", line 445, in train
    trainer.train()
  File "/home/nlpir/miniconda3/envs/cjy_llama/lib/python3.8/site-packages/transformers/trainer.py", line 1624, in train
    return inner_training_loop(
  File "/home/nlpir/miniconda3/envs/cjy_llama/lib/python3.8/site-packages/transformers/trainer.py", line...
```

I keep getting a fused_adam error. My machine has two 2080 Ti cards, and my package versions match the requirements:

```
FAILED: multi_tensor_adam.cuda.o
/usr/bin/nvcc -DTORCH_EXTENSION_NAME=fused_adam -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -I/home/patrickstar/miniconda3/envs/pytorch/lib/python3.10/site-packages/deepspeed/ops/csrc/includes -I/home/patrickstar/miniconda3/envs/pytorch/lib/python3.10/site-packages/deepspeed/ops/csrc/adam -isystem /home/patrickstar/miniconda3/envs/pytorch/lib/python3.10/site-packages/torch/include -isystem /home/patrickstar/miniconda3/envs/pytorch/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/patrickstar/miniconda3/envs/pytorch/lib/python3.10/site-packages/torch/include/TH -isystem /home/patrickstar/miniconda3/envs/pytorch/lib/python3.10/site-packages/torch/include/THC -isystem /home/patrickstar/miniconda3/envs/pytorch/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_75,code=compute_75 -gencode=arch=compute_75,code=sm_75 --compiler-options...
```
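For reference, this kind of failure comes from DeepSpeed JIT-compiling its fused_adam op at runtime. One commonly suggested workaround is to pre-build the op when installing DeepSpeed using its documented `DS_BUILD_FUSED_ADAM` flag; whether it resolves this particular nvcc error depends on the local CUDA/gcc toolchain, so treat this as a sketch, not a guaranteed fix:

```shell
# Pre-compile the fused Adam kernel at install time instead of JIT-building it
DS_BUILD_FUSED_ADAM=1 pip install deepspeed --no-cache-dir

# Report which DeepSpeed ops are built and whether they are compatible
ds_report
```

If the kernel still fails to build, another option is to avoid the fused optimizer entirely by configuring a plain `AdamW` optimizer in the DeepSpeed config.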

![image](https://github.com/taishan1994/Llama3-Finetuning/assets/156107697/ec8a28a9-125e-4a23-9414-767bd0450297)

I changed the model in the .sh script to the non-Instruct Llama, but I get an error when running torchrun. How can I fix this?
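The exact error isn't quoted here, but one plausible cause (an assumption, since only the screenshot is given) is that base, non-Instruct checkpoints ship without the chat template and were never trained on the Instruct special tokens that the fine-tuning script may expect. As a reference point, the Llama 3 Instruct prompt format such scripts typically rely on can be sketched as a plain formatting helper:

```python
def format_llama3_prompt(system: str, user: str) -> str:
    """Build a Llama 3 Instruct-style prompt string.

    Base (non-Instruct) checkpoints were not trained on these special
    tokens, which is one reason Instruct-oriented scripts can misbehave
    when pointed at a base model.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

When fine-tuning a base model, either supply such a template explicitly (e.g. via the tokenizer's `chat_template` attribute) or format prompts manually instead of relying on the Instruct checkpoint's built-in template.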