
Home of CodeT5: Open Code LLMs for Code Understanding and Generation

96 CodeT5 issues

Hello everyone! Has anyone already come up with a script for fine-tuning CodeT5+ (2B, 6B) on a custom seq2seq task?

My generation time is 70x higher with codet5p_2b compared to codet5p_770m for the same input text. Why is this?

Hello, I tried to fine-tune codet5p-2b. I loaded the model from huggingface and I got an error saying CUDA out of memory, then I tried to load the model into...
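For the OOM at load time, one common mitigation is to load the weights in half precision and let `accelerate` place layers across the available devices instead of materializing the full fp32 model on one GPU. A minimal sketch, assuming `accelerate` is installed (the generation arguments are illustrative):

```python
# Hedged sketch: memory-conscious loading of codet5p-2b.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

CHECKPOINT = "Salesforce/codet5p-2b"

def load_model(checkpoint=CHECKPOINT):
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(
        checkpoint,
        torch_dtype=torch.float16,  # halves the footprint vs fp32
        low_cpu_mem_usage=True,     # avoid a second full copy in host RAM
        device_map="auto",          # shard across GPUs / offload if needed
        trust_remote_code=True,     # codet5p-2b ships custom model code
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    print(next(model.parameters()).dtype)
```

If the model still does not fit, offloading part of it to CPU (which `device_map="auto"` does automatically) trades memory for speed; that is also one plausible reason the 2B checkpoint can generate far slower than 770M on the same hardware.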

Can CodeT5 detect bugs in Python or Java code?

I was wondering if these retrieval-augmented code generation models will be released. Thanks! ![table8](https://github.com/salesforce/CodeT5/assets/135301588/56d76af3-3860-4de3-ad69-ab4e81c49fcb)

Is there any way to create embeddings with the base model, similar to OpenAI's embedding endpoints?
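One generic approach (an assumption, not an official CodeT5 API) is to run only the T5 encoder and mean-pool its last hidden state over non-padding tokens, loosely analogous to an embedding endpoint:

```python
# Hedged sketch: sentence-style embeddings from a CodeT5+ encoder via mean
# pooling. Checkpoint name and pooling strategy are illustrative assumptions.
import torch
from transformers import AutoTokenizer, T5EncoderModel

CHECKPOINT = "Salesforce/codet5p-220m"  # assumed T5-style checkpoint

def embed(texts, tokenizer, model):
    """Return one fixed-size vector per input string."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state   # (batch, seq, dim)
    mask = enc["attention_mask"].unsqueeze(-1)    # zero out padding positions
    return (hidden * mask).sum(1) / mask.sum(1)   # mean pool -> (batch, dim)

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
    model = T5EncoderModel.from_pretrained(CHECKPOINT)
    print(embed(["def add(a, b): return a + b"], tokenizer, model).shape)
```

Salesforce has also published a dedicated CodeT5+ embedding checkpoint on the Hub, which may be a better fit than ad-hoc pooling.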

These small code adjustments make the code more concise by using optimized NumPy functions.


```
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5p-220m-py")
code = """
# this is a code comment
"""
print(tokenizer.decode(tokenizer(code)["input_ids"]))
```
output:
```
# this is a code comment
```
It...