llama
Inference code for Llama models
Thank you for such amazing work. I was wondering if there are any plans to also release intermediate checkpoints for the models, similar to Pythia (https://github.com/EleutherAI/pythia). This might enable more...
Utilize parameter substitution in `download.sh` to allow both `PRESIGNED_URL` and `TARGET_FOLDER` to be assigned via environmental variables. This removes the need to manually edit/sed the file once a developer receives...
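A minimal sketch of what that substitution could look like (hypothetical, not the actual `download.sh`; the fallback values here are placeholders): bash's `${VAR:-default}` expansion takes the value from the environment when it is set and falls back to an in-file default otherwise, so the script no longer needs to be edited by hand.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the suggested change, not the real download.sh.
# ${VAR:-default} uses the exported environment value when present,
# so a developer can run:
#   PRESIGNED_URL='https://...' TARGET_FOLDER=weights ./download.sh
PRESIGNED_URL="${PRESIGNED_URL:-https://example.invalid/replace-me}"  # placeholder fallback
TARGET_FOLDER="${TARGET_FOLDER:-llama_models}"                        # assumed default dir

echo "URL:    ${PRESIGNED_URL}"
echo "Target: ${TARGET_FOLDER}"
```

With this change, `sed`/manual edits become optional: exporting the two variables before invoking the script overrides both defaults.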
As the paper makes quite clear, proper use of open-source datasets can lead to the creation of very high-quality models; however, it is also clear that pre-processing that data...
Do you have plans to release the instruction model LLaMA-I?
https://twitter.com/ylecun/status/1629189925089296386 ([mirror 1](https://web.archive.org/web/20230224190816/https://twitter.com/ylecun/status/1629189925089296386), [mirror 2](https://archive.ph/E0X67), [mirror 3](https://i.stack.imgur.com/pNiuJ.png)) says yes (with the GPL v3 license):

> Meta is committed to open research and releases all the models the research community under...
Hello to all, thank you for this work. I guess anyone who had access to the model weights, as well as the authors, can answer my question. I may have...
I told ChatGPT about the new language model and here is what it had to say: ------------- Dear Meta team, As an AI language model myself, I fully understand...
Thank you for the open source release of the code. I have noticed that the transformer block class definition is missing the manually implemented backward function mentioned in the paper....