NanoCode012

Results 163 comments of NanoCode012

Edit: This is incorrect. See comment below: https://github.com/OpenAccess-AI-Collective/axolotl/issues/945#issuecomment-1994156576

---

@fozziethebeat , you can run it like this:

```bash
pip3 install "axolotl[flash-attn,deepspeed] @ git+https://github.com/OpenAccess-AI-Collective/axolotl"
```

Oh sorry, I did not read the earlier context. Hmm, it seems that cloning is necessary, as per the discussion above.

Closing this as a duplicate of https://github.com/OpenAccess-AI-Collective/axolotl/issues/945 The current workaround is to git clone and pip install following the readme: https://github.com/OpenAccess-AI-Collective/axolotl/issues/945#issuecomment-1900798307
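For reference, the clone-and-install workaround looks roughly like this (a sketch only; the exact extras and steps may differ, so follow the linked readme):

```bash
# clone the repo and install it in editable mode with the common extras
git clone https://github.com/OpenAccess-AI-Collective/axolotl
cd axolotl
pip3 install -e '.[flash-attn,deepspeed]'
```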

Hey, PR #786 allows for `test_dataset:` now. We also have `bench_dataset` if you want to run benchmarks (more info: https://github.com/OpenAccess-AI-Collective/axolotl/issues/311#issuecomment-2028311885).
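For context, a minimal sketch of how those keys might look in an axolotl YAML config — the paths and dataset names here are hypothetical placeholders, not defaults:

```yaml
# hypothetical config fragment illustrating the keys mentioned above
test_dataset:
  - path: data/test.jsonl   # placeholder path to a held-out test set
    type: alpaca            # placeholder prompt format
bench_dataset: data/bench.json  # placeholder benchmark dataset
```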

Hey, there was a discussion about this in Discord. May I ask if you can try checking out the following commit and see if it works?

```
git checkout...
```

Would it be possible to test with no eval? Set `val_set_size: 0`. It seems to work for me.
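Disabling eval is a one-line change in the YAML config (shown here as an isolated fragment; the rest of the config stays as-is):

```yaml
# skip carving out an eval split entirely
val_set_size: 0
```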

@FupsGamer , I think this issue is a combination of sample packing + FA. If you use the current main without packing, does it work?

Hey, I did a re-test of this a few weeks back, and the VRAM usage for QLoRA should now be correct/better. If this issue reoccurs, please let us know.

> Using the following command to merge models, there is an error message:

Hey, it seems like the PeftModel loading failed. Can you check that the files in `lora_model_dir` are valid (aren't...