codellama

Inference code for CodeLlama models

Results: 109 codellama issues, sorted by recently updated

I tried running the example code given, but there's an "EOFError: Ran out of input" that keeps popping up. Any way to solve it? torchrun --nproc_per_node 1 example_completion.py --ckpt_dir CodeLlama-7b --tokenizer_path CodeLlama-7b/tokenizer.model...

model-usage
compability
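
In Python, "EOFError: Ran out of input" usually comes from `torch.load` hitting an empty or truncated `.pth` file, so a corrupted download is the most common culprit. A minimal sanity check under that assumption (the `CodeLlama-7b` path mirrors the command above):

```python
# Sketch: confirm each checkpoint shard is non-empty and unpickles cleanly.
# A zero-byte or truncated .pth file reproduces "EOFError: Ran out of input".
from pathlib import Path

import torch

ckpt_dir = Path("CodeLlama-7b")  # same directory passed as --ckpt_dir
for ckpt in sorted(ckpt_dir.glob("*.pth")):
    print(ckpt, ckpt.stat().st_size, "bytes")
    state = torch.load(ckpt, map_location="cpu")  # raises EOFError if truncated
    print("  loaded", len(state), "entries")
```

If a shard fails here, re-downloading that model is the usual fix.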

Is there any way to create embeddings with Code Llama as the base model, like OpenAI's embedding endpoints?
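
This repo does not expose an embedding endpoint, but a common workaround is to mean-pool the model's hidden states. A sketch using the Hugging Face `transformers` port (the `codellama/CodeLlama-7b-hf` checkpoint and mean pooling are assumptions, not part of this repo):

```python
# Sketch: derive fixed-size embeddings by mean-pooling Code Llama's
# last hidden layer over non-padding tokens.
import torch
from transformers import AutoModel, AutoTokenizer

name = "codellama/CodeLlama-7b-hf"  # HF port of the base model (assumed)
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)  # add torch_dtype/device_map for GPU use
model.eval()

def embed(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state       # (1, seq, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)        # (1, seq, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean pool -> (1, dim)

print(embed("def quicksort(arr):").shape)
```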

When I use codellama or codellama-python to complete a prompt, a lot of '\n' characters are output at the end until generation reaches max_gen_len. Is there any...
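
The completion itself is usually fine and only followed by a run of newlines, so one workaround is to trim them after generation (lowering `max_gen_len` also shortens the wait). A minimal sketch, assuming completions shaped like the repo's `text_completion` output:

```python
# Sketch: strip the trailing '\n' run that pads the output to max_gen_len.
# Assumes a list of {"generation": str} dicts, as the example scripts print.
def clean(result: dict) -> str:
    return result["generation"].rstrip("\n")

results = [{"generation": "def add(a, b):\n    return a + b\n\n\n\n\n"}]
print([clean(r) for r in results])  # ['def add(a, b):\n    return a + b']
```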

Hi, on Mac I got the following error: RuntimeError: Distributed package doesn't have NCCL built in. raise RuntimeError("Distributed package doesn't have...

model-usage
compability
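
NCCL only ships in CUDA builds of PyTorch, so on a Mac the usual workaround is to fall back to the `gloo` backend. A sketch of the backend selection (where exactly the repo hard-codes `"nccl"` is worth confirming in the source):

```python
# Sketch: choose a distributed backend the local PyTorch build supports.
# gloo works on CPU-only machines such as macOS; NCCL requires CUDA.
# Run under torchrun so the rank/world-size env vars are already set.
import torch
import torch.distributed as dist

backend = "nccl" if torch.cuda.is_available() and dist.is_nccl_available() else "gloo"
dist.init_process_group(backend=backend)
print("initialized with", backend)
```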

I am trying to download the 7b-Python model using the instructions in the README file, but I'm getting this when requesting any of the 7b models: >> Enter the list...

download-install

Tried to run:

```
torchrun --nproc_per_node 1 codellama/example_instructions.py \
    --ckpt_dir /home/ubuntu/model/ \
    --tokenizer_path /home/ubuntu/model/tokenizer.model \
    --max_seq_len 4512 --max_batch_size 4
```

I have a long prompt (4000 tokens). I have 4...
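
For context on the memory side: the repo preallocates its KV cache for `max_batch_size * max_seq_len` tokens, so a 4000-token prompt fits more comfortably with a small batch. A sketch with the parameter names the example scripts use:

```python
# Sketch: size the sequence budget for a long prompt.
# max_seq_len must cover prompt tokens + max_gen_len; the KV cache is
# preallocated for max_batch_size * max_seq_len, so shrink the batch first.
from llama import Llama

generator = Llama.build(
    ckpt_dir="/home/ubuntu/model/",
    tokenizer_path="/home/ubuntu/model/tokenizer.model",
    max_seq_len=4608,   # ~4000 prompt tokens plus generation headroom
    max_batch_size=1,   # one sequence at a time frees cache memory
)
```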

I'm having a problem with fine-tuning the codellama-7b-Instruct model for a programming language. The issue is that the model seems to focus too much on the new dataset, and...
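
One common mitigation for this kind of catastrophic forgetting is parameter-efficient tuning such as LoRA, often combined with mixing some original-style data back into the new dataset. A hedged sketch with the `peft` library (the target module names are the usual LLaMA attention projections, an assumption to verify against the checkpoint):

```python
# Sketch: LoRA trains small adapter matrices and freezes the base weights,
# which tends to preserve more general behavior than full fine-tuning.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # LLaMA attention
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # a fraction of a percent of all weights
```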

I get an email with a download URL, but when I use the download URL, I get these options: Enter the list of models to download without spaces (7b,13b,34b,7b-Python,13b-Python,34b-Python,7b-Instruct,13b-Instruct,34b-Instruct), or...

When I tried codellama-7b and codellama-34b to test code completion, all results were garbled code. Environment: OS: Red Hat 4.8.5-36, GCC: 4.8.5, GPU: 32G V100, CUDA: 11.7, torch: 2.0.0, fairscale: 0.4.13, sentencepiece: 0.1.99...
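
Garbled completions on V100s are often a dtype problem: V100 (compute capability 7.0) has no bfloat16 support, so bf16 weights can produce garbage output. A quick check worth running before loading the model (a sketch, not a confirmed diagnosis for this report):

```python
# Sketch: verify the GPU supports the dtype the weights will be loaded in.
# V100 lacks bfloat16; fall back to float16 on such hardware.
import torch

print(torch.cuda.get_device_capability())  # V100 reports (7, 0)
print(torch.cuda.is_bf16_supported())      # False on V100
dtype = torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16
print("load weights as", dtype)
```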