
This codebase has so many errors it is completely useless and unusable

Open Abecid opened this issue 2 years ago • 2 comments

precision = "bf16-true"

is unsupported but in the code for lora-finetuning

fabric.init_module

causes a does not exist error.

with fabric.device

results in an error in full fine-tuning script
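For reference, the failing calls boil down to something like the following (a minimal sketch assuming the lightning 2.x Fabric API and PyTorch 2.x; the actual lit-llama scripts pass more arguments):

```python
import torch
import lightning as L

# "bf16-true" precision is only recognized by newer lightning releases
# and needs hardware/software with bfloat16 support
fabric = L.Fabric(devices=1, precision="bf16-true")

# fabric.init_module() only exists in recent lightning versions; on older
# installs it raises the "does not exist" (AttributeError) mentioned above
with fabric.init_module():
    model = torch.nn.Linear(4096, 4096)

# torch.device only became a context manager in PyTorch 2.0, so
# `with fabric.device:` fails on older PyTorch versions
with fabric.device:
    x = torch.randn(1, 4096)
```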

Overall, a very poor experience and poor documentation. Garbage.

Abecid avatar Aug 07 '23 08:08 Abecid

I don't know about the with fabric.device issue, but let me address the other two

  1. precision = "bf16-true"

  2. fabric.init_module

with more explicit warnings and suggestions via a PR shortly.
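In the meantime, a possible workaround is to guard the newer call and fall back gracefully on older lightning installs (just a sketch; the hasattr check and the tiny stand-in model are illustrative, not lit-llama code):

```python
import contextlib
import torch
import lightning as L

# Note: very old lightning versions may also reject "bf16-true";
# "bf16" was the older spelling of this precision flag
fabric = L.Fabric(devices=1, precision="bf16-true")

# If this lightning version predates Fabric.init_module, fall back to a
# plain no-op context instead of crashing with "does not exist"
if hasattr(fabric, "init_module"):
    init_ctx = fabric.init_module()
else:
    init_ctx = contextlib.nullcontext()

with init_ctx:
    model = torch.nn.Linear(8, 8)  # stand-in for the LLaMA model
```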

rasbt avatar Aug 08 '23 18:08 rasbt

@Abecid What error are you getting with bfloat16? I think it's only supported on Ampere and newer, but it appears that it now also works on older T4s and on CPU. Just tested it. Maybe it's a PyTorch version thing.
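If it helps, a quick diagnostic along these lines prints everything relevant (just a sketch for reporting, not part of the repo):

```python
import torch

print("PyTorch:", torch.__version__)
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    # bf16 is native on Ampere (compute capability >= 8.0); on older cards
    # like the T4, PyTorch can still emulate it, just more slowly
    print("Compute capability:", torch.cuda.get_device_capability(0))
    print("bf16 supported:", torch.cuda.is_bf16_supported())
else:
    # recent PyTorch versions also support bfloat16 on CPU
    print("CUDA not available; bf16 CPU test:", torch.randn(2, 2, dtype=torch.bfloat16).sum())
```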

If you have time and don't mind spending a few more minutes, could you let me know the error message you are getting and your PyTorch version so I can look into it further? I could then add a more explicit warning to save the hassle for future users.

[Screenshot 2023-08-08 at 1 15 44 PM]

rasbt avatar Aug 08 '23 18:08 rasbt