
'PromptForGeneration' object has no attribute 'can_generate'

Open derek-kam opened this issue 2 years ago • 6 comments

I am trying tutorial `4.1_all_tasks_are_generation`. The training ran for a while, then it stopped with the following errors. Any ideas how to solve this?

```
Train:   2%|███▋          | 500/20000 [11:36<7:37:26, 1.41s/it, loss=45.9]
Traceback (most recent call last):
  File "/home/opc/gpt-dev/OpenPrompt/tutorial/4.1_all_tasks_are_generation.py", line 399, in <module>
    val_acc = evaluate(prompt_model, validation_dataloader)
  File "/home/opc/gpt-dev/OpenPrompt/tutorial/4.1_all_tasks_are_generation.py", line 302, in evaluate
    _, output_sentence = prompt_model.generate(inputs, **generation_arguments, verbose=False)
  File "/home/opc/.local/lib/python3.9/site-packages/openprompt/pipeline_base.py", line 499, in generate
    output_sequences = super().generate(**batch, **input_generation_kwargs, pad_token_id=self.tokenizer.pad_token_id, eos_token_id=self.tokenizer.eos_token_id)
  File "/home/opc/.local/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/opc/.local/lib/python3.9/site-packages/transformers/generation/utils.py", line 1246, in generate
    self._validate_model_class()
  File "/home/opc/.local/lib/python3.9/site-packages/transformers/generation/utils.py", line 1101, in _validate_model_class
    if not self.can_generate():
  File "/home/opc/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'PromptForGeneration' object has no attribute 'can_generate'
```

derek-kam avatar May 19 '23 04:05 derek-kam

Same problem with `2.1_conditional_generation.py`. Is this a bug in `PromptForGeneration`?

derek-kam avatar May 19 '23 06:05 derek-kam

You can try uninstalling transformers and installing `transformers==4.19.0`; then it works.
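Pinned in a `requirements.txt`, that suggestion (version number taken from this comment, not independently verified against every OpenPrompt release) would be:

```
transformers==4.19.0
```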

PengZirong avatar Jul 03 '23 06:07 PengZirong

Faced the same error. Tried with transformers 4.19.0 and it worked. But 4.19.0 doesn't have `LlamaConfig`, `LlamaForCausalLM`, or `LlamaTokenizer`.

Does anyone know how I can use Llama 2 with OpenPrompt without facing the `can_generate` error?
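A self-contained toy sketch of what seems to be going wrong, plus a possible monkey-patch workaround. This is a hypothetical illustration, untested against the real OpenPrompt: the classes below are stand-ins that only mimic the attribute-lookup behaviour, not the actual OpenPrompt or transformers code.

```python
class GenerationMixin:
    """Stand-in for newer transformers' GenerationMixin."""
    def generate(self):
        # Newer transformers releases validate the model class before
        # generating by calling self.can_generate().
        if not self.can_generate():
            raise TypeError("model cannot generate")
        return "generated text"

class PromptForGeneration(GenerationMixin):
    """Stand-in for OpenPrompt's wrapper, written before can_generate existed."""
    def __getattr__(self, name):
        # nn.Module-style fallback: an unknown attribute raises AttributeError,
        # which is the error reported in this issue.
        raise AttributeError(
            f"'{type(self).__name__}' object has no attribute '{name}'")

model = PromptForGeneration()
try:
    model.generate()
except AttributeError as err:
    print(err)  # 'PromptForGeneration' object has no attribute 'can_generate'

# Possible workaround (hypothetical): attach can_generate to the wrapper class
# so the validation check passes without downgrading transformers.
PromptForGeneration.can_generate = lambda self: True
print(model.generate())  # generated text
```

If a patch like this works on the real classes, it might let a newer transformers (with Llama support) coexist with OpenPrompt, but generation output should be checked carefully rather than trusted blindly.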

UmerTariq1 avatar Sep 11 '23 14:09 UmerTariq1

> Faced the same error. Tried with transformers 4.19.0 and it worked. But 4.19.0 doesn't have `LlamaConfig`, `LlamaForCausalLM`, or `LlamaTokenizer`.
>
> Does anyone know how I can use Llama 2 with OpenPrompt without facing the `can_generate` error?

I think it would be great if this were supported.

milliemaoo avatar Sep 27 '23 17:09 milliemaoo

> Faced the same error. Tried with transformers 4.19.0 and it worked. But 4.19.0 doesn't have `LlamaConfig`, `LlamaForCausalLM`, or `LlamaTokenizer`.
>
> Does anyone know how I can use Llama 2 with OpenPrompt without facing the `can_generate` error?

Hello, I'm facing the same problem. Did you solve it?

HuiHuiSun avatar Jul 10 '24 03:07 HuiHuiSun