
Home of CodeT5: Open Code LLMs for Code Understanding and Generation

96 CodeT5 issues (sorted by recently updated)

If I want to adapt the pretrained codet5p-110m-embedding model released with CodeT5+ to obtain code embeddings for my dataset, would I need to fine-tune it?
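For reference, a minimal sketch of extracting embeddings with the `Salesforce/codet5p-110m-embedding` checkpoint, following the public Hugging Face model card (the exact checkpoint name, `trust_remote_code` usage, and output dimension are assumptions based on that card, not something confirmed in this thread):

```python
# Minimal sketch: extracting a code embedding with codet5p-110m-embedding.
# Checkpoint name and trust_remote_code flag follow the public model card.
import torch
from transformers import AutoModel, AutoTokenizer

checkpoint = "Salesforce/codet5p-110m-embedding"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModel.from_pretrained(checkpoint, trust_remote_code=True).to(device)

code = "def print_hello_world():\n    print('Hello World!')"
inputs = tokenizer.encode(code, return_tensors="pt").to(device)
with torch.no_grad():
    embedding = model(inputs)[0]   # one dense vector per input sequence
print(embedding.shape)             # e.g. torch.Size([256]) for this checkpoint
```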

Thank you for your work on the `CodeT5` model. I'm interested in utilizing it for code similarity tasks and have a query regarding its usage in test mode, specifically when comparing only two code...
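One possible approach at test time is to embed each of the two snippets (for instance with the embedding checkpoint sketched above; using that model here is my assumption, not something stated in the issue) and compare them with cosine similarity:

```python
# Sketch: comparing two code snippets via cosine similarity of their embeddings.
# embed() is a hypothetical helper wrapping an embedding model such as the one above.
import torch
import torch.nn.functional as F

def cosine_sim(emb_a: torch.Tensor, emb_b: torch.Tensor) -> float:
    # Both inputs are 1-D embedding vectors of the same dimension.
    return F.cosine_similarity(emb_a.unsqueeze(0), emb_b.unsqueeze(0)).item()

# similarity = cosine_sim(embed(code_a), embed(code_b))  # closer to 1.0 = more similar
```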

Hello, I would like to ask whether codet5p can generate code for a masked span the way codet5 does. Also, how should the model be loaded? Currently, I am...
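For comparison, this is roughly how CodeT5 fills a masked span via its T5-style sentinel tokens, adapted from the `Salesforce/codet5-base` model card (whether the CodeT5+ checkpoints support the same span infilling is exactly the open question here):

```python
# Sketch: masked-span generation with CodeT5 using the <extra_id_0> sentinel token,
# following the Salesforce/codet5-base model card.
from transformers import RobertaTokenizer, T5ForConditionalGeneration

tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids

generated_ids = model.generate(input_ids, max_length=10)
# Prints the model's prediction for the masked span
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```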

I'm trying to fine-tune the concode task using 'code' as both input & output, instead of 'nl' & 'code'. I'd like to know whether we can directly use the concode...
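One way to avoid changing the task code would be to rewrite the data so the `code` field feeds both sides. A rough sketch, assuming the usual Concode JSONL layout with `nl` and `code` keys (the file names here are hypothetical):

```python
# Sketch: turning Concode-style data into code -> code pairs by copying the "code"
# field into the "nl" slot, so the existing nl -> code pipeline can be reused as-is.
# Assumes each line of train.json is a JSON object with "nl" and "code" keys.
import json

with open("train.json") as fin, open("train_code2code.json", "w") as fout:
    for line in fin:
        example = json.loads(line)
        example["nl"] = example["code"]   # source = code instead of natural language
        fout.write(json.dumps(example) + "\n")
```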

From the paper "CODEGEN: AN OPEN LARGE LANGUAGE MODEL FOR CODE WITH MULTI-TURN PROGRAM SYNTHESIS", the architecture of CodeGen follows a standard transformer decoder with left-to-right causal masking. How do...
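For context on the quoted description, left-to-right causal masking simply means each position may attend only to itself and earlier positions; a toy sketch of such a mask:

```python
# Toy illustration of left-to-right causal masking in a decoder-only model:
# position i may attend only to positions j <= i.
import torch

seq_len = 5
causal_mask = torch.tril(torch.ones(seq_len, seq_len)).bool()
print(causal_mask)
# tensor([[ True, False, False, False, False],
#         [ True,  True, False, False, False],
#          ...
```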

Hello, I wonder if we can fine-tune CodeT5 for text-to-code retrieval the way UniXcoder does [here](https://github.com/microsoft/CodeBERT/tree/master/UniXcoder/downstream-tasks/code-search). I have run zero-shot code retrieval for JavaScript. It shows that...
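As a zero-shot baseline, one option is to embed the query and the code candidates with the same encoder and rank by cosine similarity; a sketch under the assumption that an `embed()` helper (hypothetical, like the one above) returns one vector per string:

```python
# Sketch: zero-shot text-to-code retrieval by ranking candidates with cosine similarity.
# embed() is a hypothetical helper returning one embedding vector per input string.
import torch
import torch.nn.functional as F

def rank_candidates(query_emb: torch.Tensor, cand_embs: torch.Tensor) -> torch.Tensor:
    # query_emb: (d,), cand_embs: (n, d); returns candidate indices sorted best-first.
    sims = F.cosine_similarity(query_emb.unsqueeze(0), cand_embs, dim=-1)
    return sims.argsort(descending=True)

# order = rank_candidates(embed(nl_query),
#                         torch.stack([embed(c) for c in code_candidates]))
```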

Hello, thank you very much for your excellent work. I want to use CodeT5+ for code clone detection, but there's no parameter similar to `hidden_size` in CodeT5+. What should...
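For the T5-style CodeT5+ checkpoints, the analogous value is typically exposed as `d_model` in the Hugging Face config (this is an assumption based on the standard T5 config schema; the larger decoder-based checkpoints may name it differently):

```python
# Sketch: reading the hidden dimension from a CodeT5+ checkpoint's config.
# For T5-style checkpoints (e.g. codet5p-220m) the field is typically d_model.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("Salesforce/codet5p-220m")
print(config.d_model)  # plays the role of hidden_size for a clone-detection head
```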

I have already begun the fine-tuning. However, something went wrong:

```
***** Running training *****
  Num examples = 2830
  Num Epochs = 6
  Instantaneous batch size per device = ...
```