[Bug] Error when using vLLM to deploy InternVL3-78B
Checklist
- [x] 1. I have searched related issues but cannot get the expected help.
- [x] 2. The bug has not been fixed in the latest version.
- [x] 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
Describe the bug
I use the following command to deploy the InternVL3-78B model. Could you help me find out the problem? It works fine when I deploy the 8B model in the same environment:
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m vllm.entrypoints.openai.api_server --model /root/autodl-tmp/models/InternVL3-78B --served-model-name qwen2.5-vl-72b-instruct --port=8005 --limit-mm-per-prompt image=3 --seed 0 --gpu-memory-utilization 0.92 --tensor-parallel-size 4 --max_model_len 25000 --trust-remote-code --chat-template-content-format openai
The error is below:
INFO: Started server process [3587]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR 09-14 13:08:40 [chat_utils.py:1233] An error occurred in `transformers` while applying chat template
ERROR 09-14 13:08:40 [chat_utils.py:1233] Traceback (most recent call last):
ERROR 09-14 13:08:40 [chat_utils.py:1233] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1219, in apply_hf_chat_template
ERROR 09-14 13:08:40 [chat_utils.py:1233] return tokenizer.apply_chat_template(
ERROR 09-14 13:08:40 [chat_utils.py:1233] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [chat_utils.py:1233] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 1641, in apply_chat_template
ERROR 09-14 13:08:40 [chat_utils.py:1233] rendered_chat, generation_indices = render_jinja_template(
ERROR 09-14 13:08:40 [chat_utils.py:1233] ^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [chat_utils.py:1233] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/transformers/utils/chat_template_utils.py", line 498, in render_jinja_template
ERROR 09-14 13:08:40 [chat_utils.py:1233] rendered_chat = compiled_template.render(
ERROR 09-14 13:08:40 [chat_utils.py:1233] ^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [chat_utils.py:1233] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/jinja2/environment.py", line 1295, in render
ERROR 09-14 13:08:40 [chat_utils.py:1233] self.environment.handle_exception()
ERROR 09-14 13:08:40 [chat_utils.py:1233] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/jinja2/environment.py", line 942, in handle_exception
ERROR 09-14 13:08:40 [chat_utils.py:1233] raise rewrite_traceback_stack(source=source)
ERROR 09-14 13:08:40 [chat_utils.py:1233] File "<template>", line 2, in top-level template code
ERROR 09-14 13:08:40 [chat_utils.py:1233] TypeError: can only concatenate str (not "list") to str
ERROR 09-14 13:08:40 [serving_chat.py:222] Error in preprocessing prompt inputs
ERROR 09-14 13:08:40 [serving_chat.py:222] Traceback (most recent call last):
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1219, in apply_hf_chat_template
ERROR 09-14 13:08:40 [serving_chat.py:222] return tokenizer.apply_chat_template(
ERROR 09-14 13:08:40 [serving_chat.py:222] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 1641, in apply_chat_template
ERROR 09-14 13:08:40 [serving_chat.py:222] rendered_chat, generation_indices = render_jinja_template(
ERROR 09-14 13:08:40 [serving_chat.py:222] ^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/transformers/utils/chat_template_utils.py", line 498, in render_jinja_template
ERROR 09-14 13:08:40 [serving_chat.py:222] rendered_chat = compiled_template.render(
ERROR 09-14 13:08:40 [serving_chat.py:222] ^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/jinja2/environment.py", line 1295, in render
ERROR 09-14 13:08:40 [serving_chat.py:222] self.environment.handle_exception()
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/jinja2/environment.py", line 942, in handle_exception
ERROR 09-14 13:08:40 [serving_chat.py:222] raise rewrite_traceback_stack(source=source)
ERROR 09-14 13:08:40 [serving_chat.py:222] File "<template>", line 2, in top-level template code
ERROR 09-14 13:08:40 [serving_chat.py:222] TypeError: can only concatenate str (not "list") to str
ERROR 09-14 13:08:40 [serving_chat.py:222]
ERROR 09-14 13:08:40 [serving_chat.py:222] The above exception was the direct cause of the following exception:
ERROR 09-14 13:08:40 [serving_chat.py:222]
ERROR 09-14 13:08:40 [serving_chat.py:222] Traceback (most recent call last):
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/vllm/entrypoints/openai/serving_chat.py", line 205, in create_chat_completion
ERROR 09-14 13:08:40 [serving_chat.py:222] ) = await self._preprocess_chat(
ERROR 09-14 13:08:40 [serving_chat.py:222] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/vllm/entrypoints/openai/serving_engine.py", line 813, in _preprocess_chat
ERROR 09-14 13:08:40 [serving_chat.py:222] request_prompt = apply_hf_chat_template(
ERROR 09-14 13:08:40 [serving_chat.py:222] ^^^^^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/vllm/utils/__init__.py", line 1292, in inner
ERROR 09-14 13:08:40 [serving_chat.py:222] return fn(*args, **kwargs)
ERROR 09-14 13:08:40 [serving_chat.py:222] ^^^^^^^^^^^^^^^^^^^
ERROR 09-14 13:08:40 [serving_chat.py:222] File "/root/miniconda3/envs/gui/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1235, in apply_hf_chat_template
ERROR 09-14 13:08:40 [serving_chat.py:222] raise ValueError(str(e)) from e
ERROR 09-14 13:08:40 [serving_chat.py:222] ValueError: can only concatenate str (not "list") to str
INFO: 127.0.0.1:38652 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
INFO: 127.0.0.1:38654 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
INFO: 127.0.0.1:38666 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
INFO: 127.0.0.1:38668 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
/root/miniconda3/envs/gui/lib/python3.12/site-packages/jinja2/debug.py:105: RuntimeWarning: coroutine 'MediaConnector.fetch_image_async' was never awaited
code: CodeType = compile(
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
/root/miniconda3/envs/gui/lib/python3.12/site-packages/jinja2/debug.py:105: RuntimeWarning: coroutine 'AsyncMultiModalItemTracker.all_mm_data' was never awaited
code: CodeType = compile(
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
INFO: 127.0.0.1:38682 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
INFO: 127.0.0.1:38694 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
INFO: 127.0.0.1:38700 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
INFO: 127.0.0.1:38704 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
Reproduction
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m vllm.entrypoints.openai.api_server --model /root/autodl-tmp/models/InternVL3-78B --served-model-name qwen2.5-vl-72b-instruct --port=8005 --limit-mm-per-prompt image=3 --seed 0 --gpu-memory-utilization 0.92 --tensor-parallel-size 4 --max_model_len 25000 --trust-remote-code --chat-template-content-format openai
Environment
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m vllm.entrypoints.openai.api_server --model /root/autodl-tmp/models/InternVL3-78B --served-model-name qwen2.5-vl-72b-instruct --port=8005 --limit-mm-per-prompt image=3 --seed 0 --gpu-memory-utilization 0.92 --tensor-parallel-size 4 --max_model_len 25000 --trust-remote-code --chat-template-content-format openai
Error traceback
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m vllm.entrypoints.openai.api_server --model /root/autodl-tmp/models/InternVL3-78B --served-model-name qwen2.5-vl-72b-instruct --port=8005 --limit-mm-per-prompt image=3 --seed 0 --gpu-memory-utilization 0.92 --tensor-parallel-size 4 --max_model_len 25000 --trust-remote-code --chat-template-content-format openai
Try removing `--chat-template-content-format openai`? vLLM should be able to detect the chat template format automatically.
I didn't have this flag at first, and the error was the same. The error message suggested adding it, so I did, but the error is still the same.
ValueError: can only concatenate str (not "list") to str
This error usually means a string-format chat template is being treated as if it were the openai content format. Which vLLM version are you using?
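For illustration, the two content shapes look roughly like this (a minimal sketch with generic message dicts, not taken from the actual request):

# "string" content format: the chat template can concatenate this directly.
string_format_message = {
    "role": "user",
    "content": "What is in this image?\n<image>",
}

# "openai" content format: content is a list of typed parts. If a template that
# expects a plain string receives this list, Jinja fails with
# 'can only concatenate str (not "list") to str'.
openai_format_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": "data:image/jpeg;base64,..."}},
    ],
}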
Could you share the request you used to reproduce this? I just tried the OpenGVLab/InternVL3-78B tokenizer on v0.10.2 and couldn't seem to reproduce it.
(APIServer pid=68913) INFO 09-14 17:44:40 [logger.py:40] Received request chatcmpl-93671982cbc045b1aecad1acac43d514: prompt: '<|im_start|>system\n你是书生·万象,英文名是InternVL,是由上海人工智能实验室、清华大学及多家合作单位联合开发的多模态大语言模型。<|im_end|>\n<|im_start|>user\nWhat are the animals in these images?<image>\n<image>\n<|im_end|>\n<|im_start|>assistant\n', params: SamplingParams(n=1, presence_penalty=0.0, frequency_penalty=0.0, repetition_penalty=1.0, temperature=1.0, top_p=1.0, top_k=0, min_p=0.0, seed=None, stop=[], stop_token_ids=[], bad_words=[], include_stop_str_in_output=False, ignore_eos=False, max_tokens=64, min_tokens=0, logprobs=None, prompt_logprobs=None, skip_special_tokens=True, spaces_between_special_tokens=True, truncate_prompt_tokens=None, guided_decoding=None, extra_args=None), prompt_token_ids: None, prompt_embeds shape: None, lora_request: None.
(APIServer pid=68913) WARNING 09-14 17:44:44 [registry.py:183] InternVLProcessor did not return `BatchFeature`. Make sure to match the behaviour of `ProcessorMixin` when implementing custom processors.
(APIServer pid=68913) INFO 09-14 17:44:44 [async_llm.py:318] Added request chatcmpl-93671982cbc045b1aecad1acac43d514.
(APIServer pid=68913) INFO: 127.0.0.1:42914 - "POST /v1/chat/completions HTTP/1.1" 200 OK
Or try pulling the latest checkpoint? The InternVL3 repo fixed a chat template issue just a couple of days ago: https://huggingface.co/OpenGVLab/InternVL3-78B/commit/4a42b163cc0e222e97b10b562eee98da6bf26077
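If re-downloading the full 78B weights just for that fix is impractical, one possible shortcut (a sketch, assuming the standard Hugging Face file layout; `local_dir` here reuses the model path from the launch command above) is to refresh only the small config/tokenizer files, which carry the chat template:

from huggingface_hub import snapshot_download

# Re-fetch only the JSON/tokenizer/config files (which include the chat template)
# into the existing local model directory, leaving the large weight shards alone.
snapshot_download(
    repo_id="OpenGVLab/InternVL3-78B",
    allow_patterns=["*.json", "*.py", "*.txt", "*.model"],
    local_dir="/root/autodl-tmp/models/InternVL3-78B",  # path from the launch command
)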
Here are my package versions. I'll try updating the chat template and the latest version of vLLM today and test again. Thanks, bro!
vllm 0.9.2+cu126 pypi_0 pypi
transformers 4.53.3 pypi_0 pypi
Here is my request; any image will do. My problem mainly occurs with multimodal requests, while plain-text requests work fine:
import base64

def image_url(image_path):
    # Convert one or more local image paths into base64 data URLs.
    # print(image_path)
    image_url = []
    # If image_path is not a list, wrap the single path in a one-element list
    if isinstance(image_path, str):
        image_path = [image_path]  # wrap the string in a list
    for image in image_path:  # iterate over the list of image paths
        if image is None:  # if the image path is None, skip this entry
            image_url.append(None)
            continue
        # Read the raw image bytes
        with open(image, 'rb') as image_file:
            image_data = image_file.read()
        # Encode the image bytes as base64 and wrap them in a data URL
        image_base64 = base64.b64encode(image_data).decode('utf-8')
        url = f'data:image/jpeg;base64,{image_base64}'
        image_url.append(url)
    return image_url

image_path = r"E:\A_master\Ant_Agent\AndroidWorld_test\bon_android_world\image_2\1_raw_screenshot.png"
image_url = image_url(image_path)
# print(image_url[0])
messages_img = [
{
'role': 'system',
'content': 'You are good at reasoning and problem solving. You need to help user choose the best reasoning method and ONLY give back the choosen reasoning method, for example: "1 How could I devise an experiment to help solve that problem?'
},
{
"role": "user",
"content": [
{'type': 'text', 'text': '<image>\nIn order to solve the given task:\n<Task>\nYou are an expert in metric geometry - area domain. Your task is to tackle metric geometry - area problems.\n#Question:\n The diagram shows a parallelogram $W X Y Z$ with area $S$. The diagonals of the parallelogram meet at the point $O$. The point $M$ is on the edge $Z Y$. The lines $W M$ and $Z X$ meet at $N$. The lines $M X$ and $W Y$ meet at $P$. The sum of the areas of triangles $W N Z$ and $X Y P$ is $\\frac{1}{3} S$. What is the area of quadrilateral MNOP ?\n<image1> \n Choices: {\'A\': \'$\\\\frac{1}{6} S$\', \'B\': \'$\\\\frac{1}{8} S$\', \'C\': \'$\\\\frac{1}{10} S$\', \'D\': \'$\\\\frac{1}{12} S$\', \'E\': \'$\\\\frac{1}{14} S$\'}\n\n#Answer: Solve the problem and enclose the ultimate answer you choose from {\'A\': \'$\\\\frac{1}{6} S$\', \'B\': \'$\\\\frac{1}{8} S$\', \'C\': \'$\\\\frac{1}{10} S$\', \'D\': \'$\\\\frac{1}{12} S$\', \'E\': \'$\\\\frac{1}{14} S$\'} in \\boxed{} here. If it is a multiple choice question, only one letter is allowed in the "\\boxed{} for example: \\boxed{A}, else for example: \\boxed{20}. Please reply in plain text only, without using any formatting such as bold, italics, markdown, or any other styling.\n\n</Task>\nSelect ONE reasoning method that is crucial for solving the task above\nfrom all the reasoning method description given below:\n[\'127. Explore how human-centered design principles can be applied to create solutions that are intuitive, easy to adopt, and impactful.\', \'101. Consider how different cultural perspectives or values might influence the interpretation of the problem and its solutions.\', \'20 Are there any relevant data or information that can provide insights into the problem? If yes, what data sources are available, and how can they be analyzed?\', \'109. Examine how laws or regulations might impact the problem or the solutions, and identify any legal challenges that could arise.\', \'123. Use a "pre-mortem" analysis: Assume the solution failed and work backward to identify the potential reasons why.\']\n'},
{
"type": "image_url",
"image_url": {
# 'url': f'data:image/jpeg;base64,{image_url[0]}'
'url': image_url[0]
},
},
],
}
]
# breakpoint()
messages = []  # defined here so the snippet runs standalone
messages.append(messages_img)
To be specific, the `content` under `"role": "user"` is a list, and that is what triggers the error: the template cannot concatenate it into the prompt string.
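For completeness, a minimal sketch of how the messages built above would be posted to the server (assuming the `openai` Python client and the port/served model name from the original launch command; not an exact copy of the calling code):

from openai import OpenAI

# Point the client at the vLLM OpenAI-compatible server started above.
client = OpenAI(base_url="http://localhost:8005/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="qwen2.5-vl-72b-instruct",  # must match --served-model-name
    messages=messages_img,            # the list-of-parts "content" built above
    max_tokens=512,
)
print(response.choices[0].message.content)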
After updating the vLLM version, InternVL3-Instruct still reports the same error with the same cause. I tested the same call, and everything works fine with InternVL3_5. I have confirmed that the model's updated chat template is already the latest version. vLLM environment info:
vllm 0.10.2 pypi_0 pypi
Launch command:
CUDA_VISIBLE_DEVICES=2 python -m vllm.entrypoints.openai.api_server --model /data/huxueyu/xt/models/InternVL3-8B-Instruct --served-model-name qwen2.5-vl-7b-instruct --port=8006 --limit-mm-per-prompt.image 2 --tensor-parallel-size 1 --seed 42 --trust-remote-code --gpu_memory_utilization 0.90
Error message:
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] An error occurred in `transformers` while applying chat template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] Traceback (most recent call last):
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1462, in apply_hf_chat_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] return tokenizer.apply_chat_template(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 1640, in apply_chat_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] rendered_chat, generation_indices = render_jinja_template(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] ^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/transformers/utils/chat_template_utils.py", line 521, in render_jinja_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] rendered_chat = compiled_template.render(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] ^^^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/jinja2/environment.py", line 1295, in render
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] self.environment.handle_exception()
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/jinja2/environment.py", line 942, in handle_exception
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] raise rewrite_traceback_stack(source=source)
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] File "<template>", line 2, in top-level template code
(APIServer pid=1058477) ERROR 09-15 07:05:30 [chat_utils.py:1475] TypeError: can only concatenate str (not "list") to str
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] Error in preprocessing prompt inputs
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] Traceback (most recent call last):
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1462, in apply_hf_chat_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] return tokenizer.apply_chat_template(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 1640, in apply_chat_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] rendered_chat, generation_indices = render_jinja_template(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/transformers/utils/chat_template_utils.py", line 521, in render_jinja_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] rendered_chat = compiled_template.render(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ^^^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/jinja2/environment.py", line 1295, in render
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] self.environment.handle_exception()
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/jinja2/environment.py", line 942, in handle_exception
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] raise rewrite_traceback_stack(source=source)
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "<template>", line 2, in top-level template code
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] TypeError: can only concatenate str (not "list") to str
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251]
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] The above exception was the direct cause of the following exception:
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251]
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] Traceback (most recent call last):
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/vllm/entrypoints/openai/serving_chat.py", line 227, in create_chat_completion
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ) = await self._preprocess_chat(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/vllm/entrypoints/openai/serving_engine.py", line 795, in _preprocess_chat
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] request_prompt = apply_hf_chat_template(
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] File "/home/huxueyu/miniconda3/envs/xt_vllm-latest/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1478, in apply_hf_chat_template
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] raise ValueError(str(e)) from e
(APIServer pid=1058477) ERROR 09-15 07:05:30 [serving_chat.py:251] ValueError: can only concatenate str (not "list") to str
(APIServer pid=1058477) INFO: 127.0.0.1:43442 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
I have run into this issue as well. Did you ever manage to fix it?
Hmmm, this is strange. I still can't reproduce this error. ~~Could you share the request you used to reproduce it?~~ (I missed it, let me try again)
The command I used to start the server:
vllm serve OpenGVLab/InternVL3-8B-Instruct --limit-mm-per-prompt.image 2 --tensor-parallel-size 2 --trust-remote-code
Request:
from openai import OpenAI

# Modify OpenAI's API key and API base to use vLLM's API server.
openai_api_key = "EMPTY"
openai_api_base = "http://localhost:8000/v1"

client = OpenAI(
    # defaults to os.environ.get("OPENAI_API_KEY")
    api_key=openai_api_key,
    base_url=openai_api_base,
)

# Query the served model name from the server (the original snippet used an
# undefined `model` variable).
models = client.models.list()
model = models.data[0].id

image_url_duck = "https://upload.wikimedia.org/wikipedia/commons/d/da/2015_Kaczka_krzy%C5%BCowka_w_wodzie_%28samiec%29.jpg"
image_url_lion = "https://upload.wikimedia.org/wikipedia/commons/7/77/002_The_lion_king_Snyggve_in_the_Serengeti_National_Park_Photo_by_Giles_Laurent.jpg"

chat_completion_from_url = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What are the animals in these images?"},
                {
                    "type": "image_url",
                    "image_url": {"url": image_url_duck},
                },
                {
                    "type": "image_url",
                    "image_url": {"url": image_url_lion},
                },
            ],
        }
    ],
    model=model,
    max_completion_tokens=64,
)

result = chat_completion_from_url.choices[0].message.content
print("Chat completion output:", result)
Output:
Fetching 1 files: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 13751.82it/s]
(APIServer pid=7654) INFO 09-15 17:00:18 [chat_utils.py:528] Detected the chat template content format to be 'openai'. You can set `--chat-template-content-format` to override this.
(APIServer pid=7654) WARNING 09-15 17:00:23 [registry.py:183] InternVLProcessor did not return `BatchFeature`. Make sure to match the behaviour of `ProcessorMixin` when implementing custom processors.
(APIServer pid=7654) INFO 09-15 17:00:34 [loggers.py:123] Engine 000: Avg prompt throughput: 364.3 tokens/s, Avg generation throughput: 0.2 tokens/s, Running: 1 reqs, Waiting: 0 reqs, GPU KV cache usage: 2.3%, Prefix cache hit rate: 0.0%
(APIServer pid=7654) INFO: 127.0.0.1:54250 - "POST /v1/chat/completions HTTP/1.1" 200 OK
Chat completion output: I'm unable to identify or verify the identities of animals in images. However, the first image shows a duck on the water, and the second one features a lion in a grassy field. If you need more detailed information about these animals, you might consult natural history books or expert resources.
Could you tell me your environment and machine? I'm using A800 GPUs, which probably can't support a higher CUDA version, so I can't use the latest vLLM. My vLLM version is 0.8.5.
messages.append(messages_img)
@YuanDaoze ~~What does `messages` look like here? I feel this should be written as messages.extend(messages_img)~~
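(For context only, since the comment above is struck through and the surrounding code is not shown here: a small sketch of the append/extend difference, where messages_img is assumed to be a list of message dicts.)

messages = [{"role": "system", "content": "You are a helpful assistant."}]
# Assumed shape for illustration: a list of extra message dicts built elsewhere.
messages_img = [{"role": "user", "content": [{"type": "text", "text": "Describe the image."}]}]

messages.append(messages_img)  # nests the whole list as a single element: [dict, [dict]]
messages.pop()                 # undo, to compare

messages.extend(messages_img)  # appends each dict individually: [dict, dict]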
I'm hitting the same error as you.
Could you tell me your environment and machine? I'm using A800 GPUs, which probably can't support a higher CUDA version, so I can't use the latest vLLM. My vLLM version is 0.8.5.
I'm using vLLM 0.10.2 + CUDA 12.6 + two T4 cards. vLLM 0.8.5 is quite old, and we fixed several chat template bugs around v0.9. 🥲
@YuanDaoze I tried the message above; the problem seems to be in the system prompt part:
{
'role': 'system',
'content': 'You are good at reasoning and problem solving. You need to help user choose the best reasoning method and ONLY give back the choosen reasoning method, for example: "1 How could I devise an experiment to help solve that problem?'
},
If I remove the system prompt, or launch the model with --chat-template-content-format string, the model produces output normally in both cases. I suspect the chat template in the InternVL3 model repo is still a bit broken. 🤔
{%- if messages[0]['role'] == 'system' %}{{- '<|im_start|>system\n' + messages[0]['content'] + '<|im_end|>\n' }}
With the openai content format, the messages that vLLM feeds into the chat template look roughly like this:
[{'role': 'system', 'content': [{'type': 'text', 'text': 'You are good at reasoning and problem solving.'}]}, {'role': 'user', 'content': [{'type': 'text', 'text': 'What are the animals in these images?'}, {'type': 'image'}, {'type': 'image'}]}]
But the system part of the InternVL3 template is written for string content:
{%- if messages[0]['role'] == 'system' %}{{- '<|im_start|>system\n' + messages[0]['content'] + '<|im_end|>\n' }}
So with --chat-template-content-format openai, as soon as the request contains a system prompt, the template ends up doing a list + str concatenation.
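As a minimal standalone sketch of the mismatch (assuming only that jinja2 is installed; the template text is the system block quoted above), the failure can be reproduced outside vLLM like this:

from jinja2 import Template

# The system block from the InternVL3 chat template, copied verbatim.
system_block = Template(
    r"{%- if messages[0]['role'] == 'system' %}"
    r"{{- '<|im_start|>system\n' + messages[0]['content'] + '<|im_end|>\n' }}"
    r"{%- endif %}"
)

# String content (what --chat-template-content-format string passes in): renders fine.
print(system_block.render(
    messages=[{"role": "system", "content": "You are a helpful assistant."}]
))

# List-of-parts content (what the openai content format passes in): fails.
try:
    system_block.render(
        messages=[{"role": "system",
                   "content": [{"type": "text", "text": "You are a helpful assistant."}]}]
    )
except TypeError as e:
    print("TypeError:", e)  # can only concatenate str (not "list") to str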
Thanks! So how should this be solved? I just tried the tokenizer approach you mentioned, but the model's instruction following seems to have degraded and it can no longer output the tag? It feels strange.
Try switching to --chat-template-content-format string?
Thanks! I understand now. I appreciate your help; I'll dig into the chat template further. Thanks a lot!
Hi, I've run into the same problem here: the 38B model deploys fine but the 78B does not. Was this resolved in the end?