CogVLM

TypeError

Open · conheaven opened this issue 1 year ago · 1 comment

System Info / 系統信息

NVIDIA-SMI 535.104.05 Driver Version: 535.104.05 CUDA Version: 12.2

python -c "import transformers; print(transformers.__version__)"
4.45.2

Python 3.10.15

Who can help? / 谁可以帮助到您?

@zr

Information / 问题信息

  • [X] The official example scripts / 官方的示例脚本
  • [ ] My own modified scripts / 我自己修改的脚本和任务

Reproduction / 复现过程

python cli_demo_hf.py --bf16

Expected behavior / 期待表现

After entering the image path and the question, an error is raised:

image path >>>>> /data1/khw/cogvlm/dataset/dog.jpeg
Human: what is this
Traceback (most recent call last):
  File "/data1/khw/cogvlm/CogVLM/basic_demo/cli_demo_hf.py", line 127, in <module>
    outputs = model.generate(**inputs, **gen_kwargs)
  File "/data1/khw/miniconda3/envs/cogvlm/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/data1/khw/miniconda3/envs/cogvlm/lib/python3.10/site-packages/transformers/generation/utils.py", line 2047, in generate
    result = self._sample(
  File "/data1/khw/miniconda3/envs/cogvlm/lib/python3.10/site-packages/transformers/generation/utils.py", line 3055, in _sample
    model_kwargs = self._update_model_kwargs_for_generation(
  File "/home/khw/.cache/huggingface/modules/transformers_modules/cogvlm-chat-hf/modeling_cogvlm.py", line 750, in _update_model_kwargs_for_generation
    model_kwargs["past_key_values"] = self._extract_past_from_model_output(
TypeError: GenerationMixin._extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format'

conheaven avatar Oct 21 '24 11:10 conheaven

transformers==4.31.2

MachineDora avatar Dec 03 '24 13:12 MachineDora