mac0q

5 comments by mac0q

@zsb87 It looks like your local model or API is refusing to respond. Usually this happens because the model's capabilities are limited. Can you tell me which model version you are using?
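If the model is being served locally through Ollama (the `llava:7b` tag suggests so, though that is just my assumption), listing the installed models is a quick way to confirm the exact version. A minimal sketch, assuming the default endpoint `http://localhost:11434`:

```python
# Rough sketch: query a local Ollama server for the models it has pulled.
# Adjust the host/port if your setup differs.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
for model in resp.json().get("models", []):
    # "name" carries the tag (e.g. "llava:7b"); "digest" pins the exact build.
    print(model.get("name"), model.get("digest", ""))
```

Equivalently, `ollama list` on the command line shows the same information.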

@zsb87 I think llava:7b is still too weak for this task. We will try to optimize the prompt to make it workable, but GPT-4V is definitely the best choice.

Can you show us the configuration so we can reproduce it?

Gotcha. But it seems to be a little different from the description of the demo. https://github.com/THUDM/CogVLM/tree/main#:~:text=The%20program%20will%20automatically%20download%20the%20sat%20model%20and%20interact%20in%20the%20command%20line

@linmiao5 At the beginning, llava:7b was used to get a first taste of UFO, but because of its limited context length, the input has to be shaped to satisfy llava's restrictions...
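To make the context-length limitation concrete, here is a rough, hypothetical sketch of the kind of trimming a long UFO-style prompt needs before it can be sent to a small-context model such as llava:7b. The whitespace-based token count is only an approximation, not llava's real tokenizer, and the budget numbers are illustrative.

```python
def trim_to_context(prompt: str, max_tokens: int = 2048, reserve_for_reply: int = 512) -> str:
    """Crude truncation so prompt + expected reply fit a small context window.

    Whitespace splitting only approximates real tokenization; it is used here
    purely to illustrate why long prompts break on a small-context model.
    """
    budget = max_tokens - reserve_for_reply
    words = prompt.split()
    if len(words) <= budget:
        return prompt
    # Keep the most recent part of the prompt, which usually carries the
    # current instruction, and drop the oldest history first.
    return " ".join(words[-budget:])
```

With GPT-4V the window is large enough that this step is rarely needed, which is part of why it remains the recommended choice here.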