etemiz

12 comments by etemiz

While trying to train https://huggingface.co/cognitivecomputations/dolphin-2.9-llama3-70b I am getting the same error, "ValueError: Cannot flatten integer dtype tensors". The error seemed to be resolved after I reinstalled LLaMA-Factory....

I got the same error when I tried `--use_dora True`.
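For context, my understanding is that this flag ends up enabling DoRA in peft's `LoraConfig`. Here is a minimal sketch of that (not LLaMA-Factory's actual code; the target modules and hyperparameters are only illustrative):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Any causal LM works for this sketch; the name just matches the model from this thread.
model = AutoModelForCausalLM.from_pretrained("cognitivecomputations/dolphin-2.9-llama3-70b")

config = LoraConfig(
    r=16,                                # illustrative rank
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"], # illustrative; LLaMA-Factory picks its own targets
    use_dora=True,                       # this is what the flag turns on
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()
```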

nostr.ch did PoW! So it would return something like `pow: need 30 for this IP`? Then the client has to parse the string to find the 30?
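Roughly what I imagine the client side would have to do, as a sketch: this assumes the relay sends a free-form notice with a `pow:` prefix and that the required difficulty is the first number in the text, which is just my guess at the format.

```python
import re

def parse_required_pow(message: str) -> int | None:
    """Pull the required PoW difficulty out of a relay rejection message, if any.

    Assumes a machine-readable "pow:" prefix followed by free-form text,
    e.g. "pow: need 30 for this IP" (the wording is an assumption, not a spec).
    """
    if not message.startswith("pow:"):
        return None
    match = re.search(r"\d+", message)  # first number in the free-form part
    return int(match.group()) if match else None

print(parse_required_pow("pow: need 30 for this IP"))  # -> 30
```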

This can also be useful for IPs in a relay sync use case.

I added `use_dora: true` to the yaml. It said "Cannot flatten integer dtype tensors":

![Screenshot from 2024-06-11 20-15-50](https://github.com/hiyouga/LLaMA-Factory/assets/1871703/fb7cfb73-2084-4c84-8c43-f9f2c2bde61d)

Version 0.8.0. Thanks!

bitsandbytes 0.43.1

Tried these:
```
pip uninstall peft
pip install git+https://github.com/huggingface/peft.git
```
and getting the same error:
```
[rank1]: File "...../LLaMA-Factory/v/lib/python3.11/site-packages/peft/tuners/lora/dora.py", line 74, in forward
[rank1]:     x_eye = torch.eye(lora_A.weight.shape[1], device=lora_A.weight.device, dtype=x.dtype)
[rank1]: ...
```

I had to use another machine with a different IP to run it successfully.

My issue seems to involve both stopping and repetitive sentences.