
Home of StarCoder: fine-tuning & inference!

127 starcoder issues, sorted by recently updated

As per the title. Right now the plugin is only published on the proprietary VS Code Marketplace. Would it be possible to publish it on Open VSX too? Then VS Code-derived editors...

Is it possible to do Java-to-C# / C#-to-Java code translation, as mentioned in https://github.com/salesforce/CodeT5 ?
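StarCoder is a base completion model with no dedicated translation API, so translation between languages is typically framed as few-shot prompting: show one or more (source, target) pairs and let the model complete the last target slot. A minimal sketch of such a prompt builder (all names here are hypothetical, not part of any StarCoder API):

```python
def translation_prompt(src_lang, dst_lang, code, examples=()):
    """Build a few-shot prompt asking a code LLM to translate code.

    `examples` is a sequence of (source_snippet, translated_snippet) pairs
    that demonstrate the task before the final, unanswered slot.
    """
    parts = []
    for src, dst in examples:
        parts.append(f"{src_lang}:\n{src}\n{dst_lang}:\n{dst}\n")
    # Leave the target slot empty so the model fills it in via completion.
    parts.append(f"{src_lang}:\n{code}\n{dst_lang}:\n")
    return "\n".join(parts)

prompt = translation_prompt(
    "Java", "C#",
    "public int add(int a, int b) { return a + b; }",
    examples=[("int x = 1;", "int x = 1;")],
)
```

The resulting string would then be passed to `model.generate` as with any other completion; output quality will depend heavily on the examples chosen.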

Can I deploy the model locally in an offline (no-Internet) environment and use StarCoder via the VS Code plugin? If so, how should I do it?
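One common pattern for air-gapped use (a sketch, not an official recipe): download the weights on a machine with Internet access, copy them over, then force the libraries into offline mode so nothing tries to reach the Hub. The `/models/starcoder` path below is a hypothetical local directory; the environment variables are the ones huggingface_hub/transformers recognize. Serving that model to the VS Code extension is a separate step not covered here.

```python
import os

def enable_offline_mode():
    """Tell huggingface_hub/transformers to use only local files,
    never the network."""
    os.environ["HF_HUB_OFFLINE"] = "1"
    os.environ["TRANSFORMERS_OFFLINE"] = "1"

enable_offline_mode()

# With the weights already copied to the offline machine, loading then
# never touches the network (path is hypothetical):
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tokenizer = AutoTokenizer.from_pretrained("/models/starcoder")
# model = AutoModelForCausalLM.from_pretrained(
#     "/models/starcoder", local_files_only=True)
```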

```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/tmp/ipykernel_1004512/2718782402.py in <module>
      2 tokenizer = AutoTokenizer.from_pretrained(checkpoint, use_auth_token=True)
      3 # to save memory consider using fp16 or bf16 by specifying torch_dtype=torch.float16 for example
----> 4 ...
```

According to the Hugging Face [blog](https://huggingface.co/blog/starcoder), all code for data preprocessing will be released. Where is this code? Could you provide a link? Thanks!

Hello, was the model also trained on pull/merge requests? Is it reasonable to try to use it for code reviews? Thanks!

```
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)
```

Could anyone help me resolve this problem? A T4 has almost 15 GB of GPU memory. ![image](https://github.com/bigcode-project/starcoder/assets/15274284/bf1933a1-153b-4798-9220-e59f1234c264) And if I use: `with torch.device(device): model =...`

Is it possible to release the model as a serialized ONNX file? It would probably also be a good idea to release some sample code for an ONNX inference engine behind a public RESTful API.
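As a rough sketch of what such a setup might involve: the export could go through the `optimum` library's ONNX Runtime integration (untested here), and a REST front end would then just shuttle JSON completion requests to the engine. The endpoint field names below are illustrative only, not an existing API:

```python
import json

def completion_request(prompt, max_new_tokens=64):
    """Build a JSON body for a hypothetical REST completion endpoint
    sitting in front of an ONNX inference engine."""
    return json.dumps({"prompt": prompt, "max_new_tokens": max_new_tokens})

# The export step could look roughly like this (assumes `optimum` with
# its onnxruntime extras; an untested sketch, not sample code from the repo):
# from optimum.onnxruntime import ORTModelForCausalLM
# model = ORTModelForCausalLM.from_pretrained("bigcode/starcoder", export=True)
# model.save_pretrained("./starcoder-onnx")

body = completion_request("def fib(n):")
```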