
Home of CodeT5: Open Code LLMs for Code Understanding and Generation

Results: 96 CodeT5 issues, sorted by recently updated

Dear Team, I tried to train the model on 2 GPUs (devices 0,1) and ran into the following problem, which I did not face with a single GPU. Could you...

add two codet5-large checkpoints

cla:missing

Hi, I trained a model on my own task and subtask, but when running the model I get `error 1` below saying that config.json is missing. When I look at...

In the paper, there is a task that has the model predict whether a code token is an identifier or not. Can you explain in more detail how to do...
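This refers to the identifier-tagging objective, which assigns each code token a binary label (identifier or not). As an illustration only, here is a simplified sketch of what such labels look like for Python code using the standard-library tokenizer; the paper derives identifier information from parsed code, not from this stand-in:

```python
# Illustrative sketch: binary identifier labels via Python's stdlib tokenizer.
# This only shows the label format, not the paper's actual labeling pipeline.
import io
import keyword
import tokenize

def identifier_labels(code: str):
    """Return (token, label) pairs: label 1 for identifiers, 0 otherwise."""
    pairs = []
    for tok in tokenize.generate_tokens(io.StringIO(code).readline):
        if tok.type == tokenize.NAME:
            # NAME covers both keywords and identifiers; keywords get label 0.
            pairs.append((tok.string, 0 if keyword.iskeyword(tok.string) else 1))
        elif tok.type in (tokenize.OP, tokenize.NUMBER, tokenize.STRING):
            pairs.append((tok.string, 0))
    return pairs

print(identifier_labels("total = n + 1"))
# -> [('total', 1), ('=', 0), ('n', 1), ('+', 0), ('1', 0)]
```

A model trained on this objective then predicts the 0/1 label for each token in its own (subword) tokenization.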

Training on codet5_base works; how can I fix this? ``` CUDA_VISIBLE_DEVICES=0 python /home/aldo/CodeT5/run_gen.py --do_train --do_eval --do_eval_bleu --do_test --task concode --sub_task none --model_type codet5 --data_num 100 --num_train_epochs 1 --warmup_steps 10 --learning_rate...

I found that the released checkpoints do not include a concode checkpoint for codet5-small. My reproduction of it scores lower than what is reported in the paper. Can you provide the checkpoint...

Hi. My two colleagues and I are interested in replicating the results of CodeT5-base on the code generation task with our own dataset. However, we're having a few hiccups with preprocessing...

Hello, thanks for your work and the public models! However, when I tried to use the example code provided in your repository with CodeT5+ 2B / 6B, I was not getting...

Thanks for your wonderful work. However, I wonder how we can use CodeT5+ to generate code from a natural language description?

For example, how do we configure encoder-only mode? Should we explicitly pass something like mode='encoder-only' when encoding sentences, as in UniXcoder?
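A common pattern for encoder-only use of a seq2seq model (an assumption here, not something this repo documents) is to run just the encoder and mean-pool its hidden states, masking out padding, to get one embedding per sentence. A minimal numpy sketch of the pooling step, with shapes and names assumed for illustration:

```python
# Illustrative sketch of masked mean-pooling over encoder hidden states,
# the usual way to turn an encoder-only forward pass into one embedding.
# Shapes and names here are assumptions, not the CodeT5 API.
import numpy as np

def mean_pool(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """hidden_states: (seq_len, dim); attention_mask: (seq_len,) of 0/1."""
    mask = attention_mask[:, None].astype(hidden_states.dtype)
    return (hidden_states * mask).sum(axis=0) / mask.sum()

hidden = np.array([[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]])  # last row is padding
mask = np.array([1, 1, 0])
print(mean_pool(hidden, mask))  # -> [2. 3.]
```

With a Hugging Face T5-style model, the hidden states would come from calling the model's encoder submodule directly rather than the full seq2seq forward pass.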