[Bug] [RejectInvalidRequests] Skipping invalid infer request
Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [X] 2. The bug has not been fixed in the latest version.
- [X] 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
Describe the bug
I followed the docs and ran hello.py, but got this error. The log file error.log is attached. Do I need more horsepower? I am running on a laptop with an NVIDIA GeForce RTX 3050 Ti. A sketch of what I plan to try next is below.
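For reference, this is the memory-reducing config I was going to try next. It is only a sketch based on my reading of the docs; I am not sure these knobs (`cache_max_entry_count`, `quant_policy`) are the right fix for this GPU:

```python
# Untested sketch: try to fit InternVL2-2B into the laptop GPU's limited VRAM.
# Assumes TurbomindEngineConfig accepts cache_max_entry_count and quant_policy
# as described in the lmdeploy docs; I have not verified them on my setup.
from lmdeploy import pipeline, TurbomindEngineConfig

backend_config = TurbomindEngineConfig(
    session_len=4096,           # halve the context window used in the repro below
    cache_max_entry_count=0.4,  # fraction of free GPU memory reserved for the KV cache
    quant_policy=8,             # 8-bit KV cache quantization to cut memory further
)
pipe = pipeline('OpenGVLab/InternVL2-2B',
                backend_config=backend_config,
                log_level='INFO')
```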
Reproduction
```python
from lmdeploy import pipeline, TurbomindEngineConfig, ChatTemplateConfig
from lmdeploy.vl import load_image

model = 'OpenGVLab/InternVL2-2B'
system_prompt = '我是书生·万象,英文名是InternVL'
image = load_image('https://raw.githubusercontent.com/open-mmlab/mmdeploy/main/tests/data/tiger.jpeg')

chat_template_config = ChatTemplateConfig('internvl-internlm2')
chat_template_config.meta_instruction = system_prompt

pipe = pipeline(model,
                chat_template_config=chat_template_config,
                backend_config=TurbomindEngineConfig(session_len=8192),
                log_level='INFO')

response = pipe(('describe this image', image))
print(response.text)
```
Environment
Running WSL (Ubuntu 24.04) under Windows 11, with Anaconda.
Error traceback
No response (the traceback is in the attached error.log).