VisualGLM-6B
Issue with API mode: unexpected keyword argument 'mems'
I'm trying to run the API mode. I copied the model data from Hugging Face and added the following to api.py:
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("data", trust_remote_code=True)
model = AutoModel.from_pretrained("data", trust_remote_code=True).half().cuda()
model = model.eval()
app = FastAPI()
*All HF model files are in the local ./data/ directory.
After starting the server, I send it a request with curl:
curl -X POST -H "Content-Type: application/json" -d @temp.json http://127.0.0.1:8080
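For reference, here is one way to build the temp.json payload in Python. The field names ("image" as base64, "text", "history") are my assumption about what api.py expects; verify them against the handler in your copy of api.py.

```python
import base64
import json

# Hypothetical request body for the VisualGLM-6B API. The key names
# ("image", "text", "history") are assumptions -- check api.py for the
# exact schema before relying on them.
def build_payload(image_bytes: bytes, prompt: str, history=None) -> str:
    payload = {
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "text": prompt,
        "history": history or [],
    }
    return json.dumps(payload)

body = build_payload(b"\x89PNG...", "Describe this picture.")
print(json.loads(body)["text"])
```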
Here's the error I get when submitting a sample request:
INFO: 127.0.0.1:35234 - "POST / HTTP/1.1" 500 Internal Server Error
Internal Server Error
root@291f83eb6f53:/VisualGLM-6B# ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/uvicorn/protocols/http/h11_impl.py", line 428, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/usr/local/lib/python3.10/dist-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
File "/usr/local/lib/python3.10/dist-packages/fastapi/applications.py", line 276, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.10/dist-packages/starlette/applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/errors.py", line 184, in __call__
raise exc
File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/exceptions.py", line 79, in __call__
raise exc
File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "/usr/local/lib/python3.10/dist-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/usr/local/lib/python3.10/dist-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 276, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 66, in app
response = await func(request)
File "/usr/local/lib/python3.10/dist-packages/fastapi/routing.py", line 237, in app
raw_response = await run_endpoint_function(
File "/usr/local/lib/python3.10/dist-packages/fastapi/routing.py", line 163, in run_endpoint_function
return await dependant.call(**values)
File "/VisualGLM-6B/api.py", line 36, in visual_glm
answer, history, _ = chat(None, model, tokenizer, input_text, history=history, image=input_image, \
File "/VisualGLM-6B/model/chat.py", line 141, in chat
output = filling_sequence(
File "/usr/local/lib/python3.10/dist-packages/sat/generation/autoregressive_sampling.py", line 108, in filling_sequence
logits, *output_per_layers = model(
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
TypeError: ChatGLMForConditionalGenerationWithImage.forward() got an unexpected keyword argument 'mems'
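The TypeError at the bottom of the traceback is ordinary Python behavior: sat's filling_sequence passes sat-specific keyword arguments (such as mems) into model(...), but the Huggingface-format ChatGLMForConditionalGenerationWithImage.forward() does not declare them. A minimal illustration of the mechanism, with a hypothetical signature-filtering workaround (this only silences the symptom; it does not make the HF model compatible with the sat sampler):

```python
import inspect

# Stand-in for the HF forward(), which has no 'mems' parameter.
def forward(input_ids, attention_mask=None):
    return len(input_ids)

# The sat sampler passes extra keywords like 'mems'.
kwargs = {"attention_mask": None, "mems": [object()]}

try:
    forward([1, 2, 3], **kwargs)
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'mems'

# Dropping keywords the callee does not accept avoids the TypeError,
# but the sampler still relies on 'mems' semantics, so a real fix
# needs a model that actually implements them.
accepted = set(inspect.signature(forward).parameters)
filtered = {k: v for k, v in kwargs.items() if k in accepted}
print(forward([1, 2, 3], **filtered))  # -> 3
```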
What did I do wrong? How can I get the API up and running?
Thanks
This interface is for the sat model. You can roll back your code changes and let it automatically download the sat-format model. If you want to use the Huggingface-format model, please wait for our upcoming update (or try the one in the pull request; I haven't had time to verify it yet).
Without changing the code, it does not download the model automatically; it just errors out because `model` is not defined.
I just tested it, and it does download automatically. Please make sure your code is the latest version of the repository.
@git4sun