
Verification of usage of roboflow to run cogVLM

Open · PhilipAmadasun opened this issue 1 year ago · 5 comments

Is this link a legitimate source on how to use CogVLM via Roboflow?

PhilipAmadasun · Mar 10 '24

No. CogVLM is used and trained in the SAT format and does not support fine-tuning through HF and PEFT, so using it with Roboflow will cause some problems.

zRzRzRzRzRzRzR · Mar 19 '24

@zRzRzRzRzRzRzR Thanks for the reply. Is there some way to run CogVLM on an A100 and call it through some RESTful API or similar? I'm not interested in using the web UI.

PhilipAmadasun · Mar 19 '24

Try using the OpenAI-style API demo in this repo.

zRzRzRzRzRzRzR · Mar 20 '24
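
For reference, a minimal sketch of what calling that OpenAI-style demo could look like. It assumes the demo server exposes an OpenAI-compatible `/v1/chat/completions` endpoint on `localhost:8000` and accepts base64 data-URI images; the host, port, route, and model name are assumptions, not values confirmed in the repo.

```python
# Hedged sketch: query an assumed OpenAI-compatible CogVLM demo server over HTTP.
import base64

import requests

API_URL = "http://localhost:8000/v1/chat/completions"  # assumed address of the demo server

# Encode a local image as a base64 data URI, the format many OpenAI-compatible
# vision endpoints accept for image input.
with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "model": "cogvlm-chat-17b",  # assumed model identifier
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
    "max_tokens": 256,
}

response = requests.post(API_URL, json=payload, timeout=120)
response.raise_for_status()
# Assumes the standard OpenAI chat-completion response layout.
print(response.json()["choices"][0]["message"]["content"])
```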

We will look into this next week; thank you again. Please keep this issue open for now.

PhilipAmadasun · Mar 20 '24

@zRzRzRzRzRzRzR We looked at your OpenAI demo. Just to make sure we understand what is going on: your code does not actually use the OpenAI API. You developed code that works like the OpenAI API but does not call OpenAI itself, correct?

What we are currently trying to do

We have been trying to use the actual Python openai library to run the CogVLM model we pull from Hugging Face. Would that work, or are we going in the wrong direction?

Alternatives

If the proposed plan won't work, what do we have to modify in this code besides the base URL to run CogVLM from a server? And how would we keep CogVLM running on the server perpetually so we can run inference at any time?

If these questions are confusing, please let me know so I can clarify.

PhilipAmadasun · Mar 30 '24
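
A sketch of one way the two questions above could fit together, under assumptions: the official openai Python client (v1.x) can be pointed at a self-hosted OpenAI-compatible server by changing `base_url`, so the client library is reused even though api.openai.com is never contacted. The host, port, model name, and image path below are placeholders, not values confirmed by the repo.

```python
# Hedged sketch: point the official `openai` Python client (>= 1.0) at a
# self-hosted OpenAI-compatible CogVLM server instead of api.openai.com.
import base64

from openai import OpenAI

client = OpenAI(
    base_url="http://your-a100-host:8000/v1",  # assumed address of the self-hosted server
    api_key="not-needed",                      # a local demo server typically ignores the key
)

with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

completion = client.chat.completions.create(
    model="cogvlm-chat-17b",  # assumed model identifier
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
    max_tokens=256,
)
print(completion.choices[0].message.content)
```

As for keeping the server available at all times, common options are running the demo process inside tmux or under nohup, or wrapping it in a systemd service; which fits best depends on the deployment environment.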