codellama
Inference code for CodeLlama models
I removed all local files, re-cloned the repository, and requested a new download link, but when I ran ./download.sh, entered the download link, and chose the model, it still gave...
When I try to run a Python script, I get this error: ``` TypeError Traceback (most recent call last) Cell In[6], line 10 6 max_batch_size = 4 7 max_gen_len: Optional[int]...
I think we need: a VS Code plugin, model loading methods, and an API server to communicate between the plugin and the model backend. Is there a mature solution for this?
Will generating SQL be supported?
Apart from contextual information, how can I make the model recognize imported class information and similar code fragments as prompts when I want them to be passed into the...
Dear Maintainer, I hope this message finds you well. I have been trying to reproduce the performance of CodeLlama on the HumanEval dataset, as mentioned in the paper. However, despite...
Hello. I'm trying to finetune code llama for a multifile code generation task on my private repository. The goal is to have the LLM generate code for some common bugs...
In the [Meta AI blog](https://ai.meta.com/blog/code-llama-large-language-model-coding/): why have the 34B base and all Python versions not been trained with FIM?
Is there any finetuning code, especially showing how the datasets are prepared? Looking forward to your reply, thank you!
Hi, Apologies if the solution is obvious but I'm new to this. When running the example infilling script: `torchrun --nproc_per_node 1 example_infilling.py --ckpt_dir CodeLlama-7b/ --tokenizer_path CodeLlama-7b/tokenizer.model --max_seq_len 192 --max_batch_size 4`...
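Several of the issues above concern fill-in-the-middle (FIM) infilling, where the model completes code between a given prefix and suffix. As a rough illustration (not the repository's own implementation), a FIM prompt can be sketched like this; the `<PRE>`/`<SUF>`/`<MID>` sentinel strings are illustrative stand-ins for the dedicated special tokens the CodeLlama tokenizer actually uses:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a prefix-suffix-middle (PSM) infilling prompt.

    The model is expected to generate the missing middle segment
    after the <MID> sentinel. The real tokenizer inserts these
    sentinels as special token IDs rather than literal text.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"


# Hypothetical usage: ask the model to fill in a function body.
prompt = build_fim_prompt(
    "def add(a, b):\n    ",
    "\n    return result",
)
print(prompt)
```

The generated completion would then be spliced between the original prefix and suffix to form the final code.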