LLaVA
[Usage] TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
Describe the issue
Issue: The model worker shows this error when I chat with it: TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
Log:
2024-03-04 08:41:32 | ERROR | stderr | Exception in thread Thread-3 (generate):
2024-03-04 08:41:32 | ERROR | stderr | Traceback (most recent call last):
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
2024-03-04 08:41:32 | ERROR | stderr |     self.run()
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/threading.py", line 953, in run
2024-03-04 08:41:32 | ERROR | stderr |     self._target(*self._args, **self._kwargs)
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
2024-03-04 08:41:32 | ERROR | stderr |     return func(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/LLaVA/llava/model/language_model/llava_llama.py", line 138, in generate
2024-03-04 08:41:32 | ERROR | stderr |     return super().generate(
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
2024-03-04 08:41:32 | ERROR | stderr |     return func(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 1592, in generate
2024-03-04 08:41:32 | ERROR | stderr |     return self.sample(
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 2696, in sample
2024-03-04 08:41:32 | ERROR | stderr |     outputs = self(
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
2024-03-04 08:41:32 | ERROR | stderr |     return self._call_impl(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
2024-03-04 08:41:32 | ERROR | stderr |     return forward_call(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr |   File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
2024-03-04 08:41:32 | ERROR | stderr |     output = old_forward(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
I encountered the same error while running "eval_model".
eval_model(args)
You are using a model of type llava to instantiate a model of type llava_llama. This is not supported for all configurations of models and can yield errors.
Loading checkpoint shards: 100%|██████████| 2/2 [00:04<00:00, 2.27s/it]
/root/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:410: UserWarning: do_sample is set to False. However, temperature is set to 0 -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset temperature.
  warnings.warn(
Traceback (most recent call last):
  File "", line 1, in
  File "/home/humaodi/code/LLaVA/llava/eval/run_llava.py", line 115, in eval_model
    output_ids = model.generate(
  File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/humaodi/code/LLaVA/llava/model/language_model/llava_llama.py", line 137, in generate
    return super().generate(
  File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 1544, in generate
    return self.greedy_search(
  File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 2404, in greedy_search
    outputs = self(
  File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
I encountered the same error too. Any ideas?
Got this error as well. Haven't been able to fix it yet. Tracking this issue.
I had the same issue. Fixed it by pinning the transformers version to the one specified in pyproject.toml, i.e. transformers==4.37.2.
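For anyone following along, a quick way to confirm your environment actually picked up the pin (a minimal sketch, assuming a standard pip-based install):

    # Verify that the interpreter running the model worker sees the
    # transformers version pinned in LLaVA's pyproject.toml.
    import transformers

    print(transformers.__version__)  # expect: 4.37.2
    # Anything >= 4.38.0 passes cache_position into forward() and crashes.
    # If a newer version prints, reinstall the pin:
    #   pip install transformers==4.37.2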
Problem solved. Thanks!
This is because transformers 4.38.0 added the static cache, which passes a new cache_position keyword into forward(). So you have to use a version below 4.38.0.
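To make the failure mode concrete, this is plain Python keyword-argument mechanics, nothing LLaVA-specific; a toy function without the kwarg fails exactly the way the model worker does:

    # transformers >= 4.38 calls model(..., cache_position=...) during
    # generation; any forward() that doesn't declare the kwarg raises.
    def forward(input_ids, attention_mask=None, past_key_values=None):
        return input_ids

    forward(input_ids=[1, 2, 3], cache_position=0)
    # TypeError: forward() got an unexpected keyword argument 'cache_position'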
https://github.com/haotian-liu/LLaVA/issues/1218#issuecomment-1977497811 Thank you so much, my problem was solved by downgrading to transformers==4.37.2. The transformers package interface changes frequently, which is confusing and forces me to spend too much time debugging meaningless bugs.
TY <3
thank you!!!!!!
Newer models require transformers>4.39. Is there a way to actually fix this?
TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'. My transformers version is 4.37.2, and I still have this problem.
Hey, adding cache_position=None to the forward method's signature also works. Check here
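For reference, the shape of that patch (a sketch only, not the actual LLaVA code; the real signature in llava/model/language_model/llava_llama.py takes more arguments, e.g. labels, images, image_sizes, and differs between commits):

    from transformers import LlamaForCausalLM

    class LlavaLlamaForCausalLM(LlamaForCausalLM):  # LLaVA also mixes in LlavaMetaForCausalLM
        def forward(
            self,
            input_ids=None,
            attention_mask=None,
            position_ids=None,
            past_key_values=None,
            inputs_embeds=None,
            cache_position=None,  # added: accept the kwarg from transformers >= 4.38
            **kwargs,
        ):
            # ... LLaVA's multimodal preprocessing runs here in the real class ...
            # cache_position is deliberately not forwarded; the base model
            # recomputes it internally, so swallowing it here avoids the TypeError.
            return super().forward(
                input_ids=input_ids,
                attention_mask=attention_mask,
                position_ids=position_ids,
                past_key_values=past_key_values,
                inputs_embeds=inputs_embeds,
                **kwargs,
            )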