CogAgent
vllm_openai_server.py crash
Running vllm_openai_server.py:
My environment:
I hit the same problem when launching CogAgent with vLLM: "ValueError: ChatGLMForConditionalGeneration has no vLLM implementation and the Transformers implementation is not compatible with vLLM." My environment: Python 3.10.12, vllm 0.7.3+empty, torch 2.5.1, transformers 4.50.1.
Is this project still being maintained?