CogVLM
Can I run CogVLM using the actual OpenAI API?
Is it possible to run CogVLM with the actual OpenAI API (`client.chat.completions.create`)? If this is not possible, how do we run CogVLM on an A100 server using the OpenAI demo provided in this repo?
Our team looked at `openai_api_request.py`, and it isn't clear to us how we would keep CogVLM running indefinitely on the A100 server so that we can communicate with it via the `base_url` endpoint we would create for the server. If someone could explain this, it would be much appreciated.
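(For context: keeping a long-running process alive on a remote box after you log out is usually done with `tmux`, `screen`, `nohup`, or a systemd unit. A minimal `nohup` sketch, assuming the demo script sits at `openai_demo/openai_api.py` in a cloned repo with dependencies installed:)

```shell
# Start the API server in the background so it survives logout.
# Path and interpreter name are assumptions; adjust to your setup.
nohup python openai_demo/openai_api.py > cogvlm_api.log 2>&1 &
echo $! > cogvlm_api.pid        # remember the PID so the server can be stopped later
sleep 1
tail cogvlm_api.log             # peek at startup output
# Later, to stop it:  kill "$(cat cogvlm_api.pid)"
```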
You just need to launch api server.py.
@zRzRzRzRzRzRzR I don't see any script called api server.py in this repo.
https://github.com/THUDM/CogVLM/blob/main/openai_demo/openai_api.py try this
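Once `openai_api.py` is running, it exposes an OpenAI-compatible chat-completions endpoint, so `client.chat.completions.create` works if you point the client's `base_url` at the server. A minimal stdlib-only sketch; the port `8000` and the model name `cogvlm-chat-17b` are assumptions here, so check what the demo script actually prints on startup:

```python
import json
import urllib.request

# Assumed defaults -- verify the host/port openai_api.py binds to and the
# model name it registers; both may differ in your deployment.
BASE_URL = "http://localhost:8000/v1"
MODEL = "cogvlm-chat-17b"

def build_payload(image_url: str, question: str) -> dict:
    """Build an OpenAI-style chat-completions payload with one image+text turn."""
    return {
        "model": MODEL,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "image_url", "image_url": {"url": image_url}},
                    {"type": "text", "text": question},
                ],
            }
        ],
    }

def ask(image_url: str, question: str) -> str:
    """POST the payload to the demo server and return the model's reply."""
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(build_payload(image_url, question)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# The official openai client targets the same endpoint, e.g.:
#   from openai import OpenAI
#   client = OpenAI(base_url=BASE_URL, api_key="EMPTY")  # key is ignored by the demo
#   client.chat.completions.create(model=MODEL,
#                                  messages=build_payload(url, q)["messages"])
```

With the server kept alive on the A100 machine (e.g. inside `tmux`), replace `localhost` with that machine's address and the endpoint stays reachable indefinitely.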