lucy

Results: 5 comments of lucy

@danielhanchen Downgrading to 2.1.0 does not fix this issue in new installs of LLaMA Factory; it then returns to the error:

```
File "", line 65, in _rms_layernorm_forward
ValueError: Pointer argument...
```

```
# pip show flash-attn
Name: flash-attn
Version: 2.5.5
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Home-page: https://github.com/Dao-AILab/flash-attention
Author: Tri Dao
Author-email: [email protected]
License:
Location: /root/miniconda3/envs/py3.11/lib/python3.11/site-packages
Requires: einops, ninja,...
```
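For comparing environments, here is a minimal sketch of the version check I run; the package selection (torch, triton, flash-attn, unsloth, xformers) is my assumption about what is relevant to this error, not a definitive list:

```python
# Minimal environment check: print installed versions of packages that are
# plausibly involved in this issue (the selection is an assumption).
# Uses only the standard library.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("torch", "triton", "flash-attn", "unsloth", "xformers"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```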

I tested it and a few notes:

- `unsloth/tinyllama` works
- should the qlora.yml example point to `unsloth/tinyllama` instead?
- https://github.com/hiyouga/LLaMA-Factory works with the non-unsloth variant too, what are they...

Yes, just using this one as the model input: https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
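For reference, a minimal sketch of the kind of smoke test I mean, loading either model through Unsloth's Python `FastLanguageModel` API rather than a YAML config; the sequence length and 4-bit flag are assumed values for illustration, not the settings from the failing run:

```python
# Sketch of a quick load test with Unsloth's FastLanguageModel.
# max_seq_length and load_in_4bit are assumed values for illustration.
from unsloth import FastLanguageModel

MODEL = "unsloth/tinyllama"  # the variant that works for me
# MODEL = "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"  # the non-unsloth variant above

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name=MODEL,
    max_seq_length=2048,  # assumed; adjust to the config actually used
    load_in_4bit=True,    # QLoRA-style 4-bit loading
)
print(type(model).__name__, "loaded")
```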

@ButteredCats thanks! Is there any hope that this will get upstreamed at all? It seems to have been ready for a while now. Also, regarding Firefox, indeed that's what I ended...