
Got an assertion error while loading the CodeT5+ 16B-Instruct, 6B, and 2B models: "AssertionError: Config has to be initialized with encoder and decoder config". How can this be solved?

Open Tarak200 opened this issue 1 year ago • 6 comments

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "Salesforce/codet5p-2b"
device = "cuda"  # for GPU usage, or "cpu" for CPU usage

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint,
                                              torch_dtype=torch.float16,
                                              trust_remote_code=True).to(device)

encoding = tokenizer("def print_hello_world():", return_tensors="pt").to(device)
encoding['decoder_input_ids'] = encoding['input_ids'].clone()
outputs = model.generate(**encoding, max_length=15)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

I used the same example code from the Hugging Face model card and encountered the following error: `AssertionError: Config has to be initialized with encoder and decoder config`. How can I resolve this issue?
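For context, the error message comes from the model's custom configuration class, which requires both an encoder and a decoder sub-config at construction time. A minimal sketch of that pattern (`CodeT5pLikeConfig` is a hypothetical stand-in, not the real CodeT5+ source) shows why any code path that re-instantiates the class with no arguments fails:

```python
# Illustrative sketch only -- CodeT5pLikeConfig is a hypothetical
# stand-in for the custom config class shipped with the checkpoint.
class CodeT5pLikeConfig:
    def __init__(self, **kwargs):
        # The real class makes a similar assertion in configuration_codet5p.py
        assert "encoder" in kwargs and "decoder" in kwargs, (
            "Config has to be initialized with encoder and decoder config"
        )
        self.encoder = kwargs["encoder"]
        self.decoder = kwargs["decoder"]

CodeT5pLikeConfig(encoder={}, decoder={})  # fine: both sub-configs supplied
try:
    CodeT5pLikeConfig()  # no kwargs -> the assertion fires
except AssertionError as e:
    print(f"AssertionError: {e}")
```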

Tarak200 avatar Oct 14 '24 12:10 Tarak200

Hi @Tarak200 , I am also getting the same error while running inference with the CodeT5+ 2B model. Were you able to find a solution? Thanks.

Roisan avatar Jan 10 '25 06:01 Roisan

Hi @Roisan @Tarak200 , I am getting the same error. Did you find a solution? Thanks!

angelocurti avatar Feb 18 '25 22:02 angelocurti

@angelocurti Hello, I haven't been able to find a solution. Any suggestions from your end will be highly appreciated.

Thank you.

Roisan avatar Feb 21 '25 04:02 Roisan

Creating the generation config explicitly solved my problem.

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from transformers import GenerationConfig

checkpoint = "Salesforce/codet5p-2b"

# Select a device (CPU if no GPU is available, otherwise the first GPU)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Selected device: {device}")

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint,
                                              torch_dtype=torch.float16,
                                              trust_remote_code=True).to(device)

python_function = "write code for matrix multiplication"
encoding = tokenizer(python_function, return_tensors="pt").to(device)
encoding['decoder_input_ids'] = encoding['input_ids'].clone()

# Create a GenerationConfig with custom settings
generation_config = GenerationConfig(max_length=950)

outputs = model.generate(**encoding, generation_config=generation_config)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

ardhiwiratamaby avatar Mar 04 '25 20:03 ardhiwiratamaby

@ardhiwiratamaby Thank you for providing the solution, I am now able to run inference with the 2B model.

Roisan avatar Mar 06 '25 07:03 Roisan

Hello @ardhiwiratamaby, I got an assertion error while full fine-tuning CodeT5+ 2B:

```
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/trainer.py", line 2245, in train
    return inner_training_loop(
           ^^^^^^^^^^^^^^^^^^^^
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/trainer.py", line 2627, in _inner_training_loop
    self._maybe_log_save_evaluate(
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/trainer.py", line 3103, in _maybe_log_save_evaluate
    self._save_checkpoint(model, trial)
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/trainer.py", line 3200, in _save_checkpoint
    self.save_model(output_dir, _internal_call=True)
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/trainer.py", line 3902, in save_model
    self._save(output_dir)
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/trainer.py", line 4006, in _save
    self.model.save_pretrained(
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3337, in save_pretrained
    misplaced_generation_parameters = model_to_save.config._get_non_default_generation_parameters()
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/anaconda3/envs/train-model/lib/python3.11/site-packages/transformers/configuration_utils.py", line 1079, in _get_non_default_generation_parameters
    default_config = self.__class__()
                     ^^^^^^^^^^^^^^^^
File "/root/.cache/huggingface/modules/transformers_modules/Salesforce/codet5p-2b/0083d4d638746e6c9ee3dbd504e6dd68738e3c87/configuration_codet5p.py", line 78, in __init__
    "encoder" in kwargs and "decoder" in kwargs
AssertionError: Config has to be initialized with encoder and decoder config
```

How do I solve this issue?
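Judging from the traceback, `save_pretrained` calls `_get_non_default_generation_parameters`, which re-instantiates the config class via `self.__class__()` with no kwargs, tripping the same assertion. A hypothetical sketch of that failure path (the class and function names below are illustrative stand-ins, not the real transformers internals):

```python
class CodeT5pLikeConfig:
    """Hypothetical stand-in for the model's custom config class."""
    def __init__(self, **kwargs):
        assert "encoder" in kwargs and "decoder" in kwargs, (
            "Config has to be initialized with encoder and decoder config"
        )

def get_non_default_generation_parameters(config):
    # Mirrors the traceback line: a default instance of the same config
    # class is built for comparison -- with no encoder/decoder kwargs.
    default_config = type(config)()
    return default_config

cfg = CodeT5pLikeConfig(encoder={}, decoder={})
try:
    get_non_default_generation_parameters(cfg)
except AssertionError as e:
    print("save_pretrained would fail here:", e)
```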

latuan1 avatar May 18 '25 09:05 latuan1