hellostronger
I've seen an existing issue on this from March, but it didn't contain anything useful for figuring out why this error occurred. Any suggestions would be appreciated.
@hiyouga I'm on accelerate==0.28.0 and bitsandbytes==0.43.0. Could these versions be the problem? Any suggestions would be appreciated.
@hiyouga Sorry for the late reply on this. With the latest llama_factory code, it works correctly now.
@miaozhongjian Did you fix it? I'm on torch 2.1.2, CUDA 12.1, Python 3.8, and transformers 4.39.3, and I hit the same error: ValueError: Cannot flatten integer dtype tensors
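Since this thread mostly turns on comparing package versions across environments, here is a small convenience snippet (not part of LLaMA-Factory, just a sketch) to dump the versions relevant to this issue so reports are easy to compare:

```python
import importlib.metadata


def report_versions(packages):
    """Return {package: version string} for each package,
    or 'not installed' when the distribution is missing."""
    report = {}
    for name in packages:
        try:
            report[name] = importlib.metadata.version(name)
        except importlib.metadata.PackageNotFoundError:
            report[name] = "not installed"
    return report


if __name__ == "__main__":
    # Packages mentioned in this thread
    for name, ver in report_versions(
        ["torch", "transformers", "accelerate", "bitsandbytes"]
    ).items():
        print(f"{name}=={ver}")
```

Pasting that output alongside the traceback makes it much easier to see whether the error correlates with a particular accelerate/bitsandbytes combination.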
Eagerly looking forward to this.
May I ask whether the fine-tuning data will be open-sourced later?
Looking forward to the dataset being open-sourced.