CogVLM
Verifying whether Roboflow can be used to run CogVLM
No, CogVLM was trained in the SAT format and does not support finetuning through HF and PEFT, so using Roboflow will cause some problems.
@zRzRzRzRzRzRzR Thanks for the reply. Is there some way to run CogVLM on an A100 and call it through a RESTful API or something similar? I'm not interested in using the web UI.
Try the OpenAI-style demo in this repo.
We will look into this next week; again, thank you. Please keep this issue open for now.
@zRzRzRzRzRzRzR We looked at your OpenAI demo. Just to make sure we understand what is going on: your code does not actually use the OpenAI API. You developed code that behaves like the OpenAI API but does not call OpenAI's service, correct?
What we are currently trying to do
We have been trying to use the actual Python openai client library to run the CogVLM model we pulled from Hugging Face. Would that work, or are we going in the wrong direction?
Alternatives
If the proposed plan won't work, what do we have to modify in this code besides the base URL to run CogVLM from a server? And how would we keep CogVLM running on the server perpetually so we can run inference at any time?
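For context, "changing the base URL" just means sending a standard chat-completions request to the local server instead of to OpenAI. Below is a minimal, standard-library-only sketch of that request shape; the server address, model name, and image-message format are assumptions and must be matched to whatever the demo server in this repo actually exposes.

```python
# Sketch of calling a self-hosted, OpenAI-compatible chat endpoint.
# BASE_URL and the model name "cogvlm-chat" are hypothetical placeholders.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # hypothetical server address


def build_payload(prompt: str, image_url: str) -> dict:
    """Build a chat-completions request body with one text part and one image."""
    return {
        "model": "cogvlm-chat",  # hypothetical model name
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    }


def chat(prompt: str, image_url: str) -> str:
    """POST the payload to the server; requires the server to be running."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt, image_url)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer not-needed",  # self-hosted servers often ignore the key
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


# Offline check of the request shape (no server needed):
payload = build_payload("Describe this image.", "https://example.com/cat.png")
print(json.dumps(payload, indent=2))
```

The official `openai` Python client follows the same protocol, so pointing it at this base URL should work equivalently, provided the server implements the `/chat/completions` route.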
If these questions are confusing, please let me know so I can clarify.