
Can codet5p generate code for the masked part <extra_id_0>?


Hello, I would like to ask whether codet5p can generate code for the masked part <extra_id_0> in the same way codet5 does. Also, what is the correct way to load the model for this?
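
For reference, this is the span-infilling pattern I use with codet5-base (adapted from the CodeT5 model card; the example input and output here are just illustrative):

```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

# mark the span to fill in with the sentinel token <extra_id_0>
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids

# generate a single sequence for the masked span
generated_ids = model.generate(input_ids, max_length=8)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
# prints something like "{user.username}"
```

This works as expected with codet5-base; my question is whether codet5p supports the same sentinel-based infilling.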

Currently, I am trying to load codet5p-16b in the same way as codet5-base, but I encounter an error during the generation phase.

```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

# CODET5P_16B is the path/name of the codet5p-16b checkpoint;
# `text` is the input code containing the <extra_id_0> sentinel.
tokenizer = RobertaTokenizer.from_pretrained(CODET5P_16B)
model = T5ForConditionalGeneration.from_pretrained(CODET5P_16B)

input_ids = tokenizer(text, return_tensors="pt").input_ids
# simply generate a single sequence
generated_ids = model.generate(input_ids, max_length=100)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```

File "/Users/zhaojiuang/Desktop/code/ProgramRepair/codeT5/baseline_codeT5.py", line 106, in run_span generated_ids = model.generate(input_ids, max_length=100) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context return func(*args, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/transformers/generation_utils.py", line 1339, in generate model_kwargs = self._prepare_encoder_decoder_kwargs_for_generation( File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/transformers/generation_utils.py", line 583, in _prepare_encoder_decoder_kwargs_for_generation model_kwargs["encoder_outputs"]: ModelOutput = encoder(**encoder_kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl return self._call_impl(*args, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl return forward_call(*args, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 941, in forward inputs_embeds = self.embed_tokens(input_ids) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl return self._call_impl(*args, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl return forward_call(*args, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/nn/modules/sparse.py", line 162, in forward return F.embedding( File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 2233, in embedding return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse) IndexError: index out of range in self

Thanks for your help.

9aLucky · Jan 24, 2024