magicoder
[ICML'24] Magicoder: Empowering Code Generation with OSS-Instruct
After experimenting with text-generation-webui by oobabooga, I found the following things: - Magicoder models are all instruct-only models (no chat/chat-instruct) - you need to create a new custom template under **parameters/instruction...
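For reference, the custom template these threads ask about boils down to the @@ Instruction / @@ Response prompt format described on the Magicoder model cards. A minimal Python sketch of that format, assuming the wording from the model card; verify against the official README before copying it into a webui template:

```python
# Prompt format as reported on the Magicoder model cards (verify against the
# official README); {instruction} is filled with the user's request.
MAGICODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.

@@ Instruction
{instruction}

@@ Response
"""

prompt = MAGICODER_PROMPT.format(
    instruction="Write a Python function that checks whether a string is a palindrome."
)
print(prompt)
```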
I got the same problem when using the quick-start script to run an inference task, just the same as this issue: https://github.com/ise-uiuc/magicoder/issues/22. What should I do to solve this...
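For anyone hitting the same thing, a stripped-down version of the quick-start flow looks roughly like the sketch below. The checkpoint name and decoding settings are my own assumptions, not a fix from the maintainers:

```python
# Minimal inference sketch using the Hugging Face pipeline API.
# Checkpoint name and generation settings are assumptions; adjust to your setup.
import torch
from transformers import pipeline

MAGICODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.

@@ Instruction
{instruction}

@@ Response
"""

generator = pipeline(
    "text-generation",
    model="ise-uiuc/Magicoder-S-DS-6.7B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = MAGICODER_PROMPT.format(
    instruction="Write a function that merges two sorted lists."
)
result = generator(prompt, max_new_tokens=512, do_sample=False)
print(result[0]["generated_text"])
```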
There is an input format mismatch between the eval and training processes. Do you intend to **_emphasize_** the problem before the model generates its output? When doing the HumanEval(+) eval,...
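To make the mismatch concrete, here is a hypothetical illustration of the two ways a HumanEval(+) problem can be fed to the model: raw completion-style versus wrapped in the training-time instruction template. This is my own sketch, not the repository's eval code:

```python
# Hypothetical illustration of the two input styles an eval harness can use.
# The actual evaluation code in the Magicoder repo may differ.
MAGICODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.

@@ Instruction
{instruction}

@@ Response
"""

def completion_style(problem_prompt: str) -> str:
    """Raw completion: the model continues the function signature directly."""
    return problem_prompt

def instruction_style(problem_prompt: str) -> str:
    """Instruction style: the problem is wrapped in the training-time template."""
    instruction = f"Complete the following Python function:\n\n{problem_prompt}"
    return MAGICODER_PROMPT.format(instruction=instruction)
```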
Hi, thanks for your open-source release, but when I fine-tuned on my dataset (whether full-parameter or LoRA), catastrophic forgetting kept coming up (a decrease in performance on the...
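One common mitigation for this kind of catastrophic forgetting is to replay a slice of the original instruction data alongside the new domain data during fine-tuning. The sketch below is general practice rather than a recommendation from the Magicoder authors, and the local file name is a placeholder:

```python
# Replay a slice of the released OSS-Instruct data alongside new domain data
# so the model keeps seeing the distribution it was originally tuned on.
# "my_dataset.jsonl" is a placeholder for your own fine-tuning set.
from datasets import load_dataset, concatenate_datasets

original = load_dataset("ise-uiuc/Magicoder-OSS-Instruct-75K", split="train")
domain = load_dataset("json", data_files="my_dataset.jsonl", split="train")

replay = original.shuffle(seed=42).select(range(min(5_000, len(original))))
mixed = concatenate_datasets([domain, replay]).shuffle(seed=42)
```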
I just started reproducing Magicoder and could not help wondering: would a bigger OSS-Instruct dataset work better, and how much better? P.S. There are 12,000,000 Python files inside bigcode/Starcoderdata, with...
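On scaling up: OSS-Instruct generates problems from random seed code fragments, so a bigger dataset mostly means sampling more seed snippets. A rough sketch of pulling extra Python seeds from bigcode/starcoderdata follows; the `data_dir`, the `content` column name, and the snippet length are assumptions taken from the dataset card, so check them before running:

```python
# Sketch of collecting additional seed snippets for OSS-Instruct-style
# generation. data_dir and column name are assumptions; check the dataset card.
import random
from datasets import load_dataset

ds = load_dataset("bigcode/starcoderdata", data_dir="python",
                  split="train", streaming=True)

rng = random.Random(0)
seed_snippets = []
for example in ds:
    if len(seed_snippets) >= 100_000:   # how many extra seeds to collect
        break
    lines = example["content"].splitlines()
    if len(lines) < 20:
        continue
    # OSS-Instruct uses short fragments rather than whole files, so take a
    # small window of consecutive lines as the seed snippet.
    start = rng.randrange(0, len(lines) - 15)
    seed_snippets.append("\n".join(lines[start:start + 15]))
```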
I want to ask if I can replace the dilated attention with the attention used in the base model and do the fine-tuning; the idea behind this is to reduce the...
I tried to use Magicoder with Ollama on a MacBook Air M1 (16 GB). It works for other models, but when I run this one, I get an error: ``` ... ggml_metal_init: GPU name: Apple M1 ggml_metal_init:...
Sorry if this was mentioned before, but is there a stock prompt template in ooba's text-gen that works with this?