llama.cpp

Compile bug: Error while compiling llama.cpp - libggml-cuda.so: undefined reference to `log2f@GLIBC_2.27'

Open · sakshi-joshi-handle opened this issue 1 week ago · 0 comments

Problem Description

Hi, is there any support for the OpenAI API capability provided by vLLM? I want to test some models with browser-use, such as a Qwen-VL model. The only way I found to run inference with VLM models is `vllm serve`, and then connecting browser-use to it. Currently, after a few steps I get an error like `Attempted to assign 1794 = 1794 multimodal tokens to 0 placeholders` and vLLM crashes. Best regards.
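For reference, a minimal sketch of the kind of setup described above, using vLLM's OpenAI-compatible server and the standard `openai` client. The model name, port, and image URL here are placeholders chosen for illustration, not details from the original report, and the exact way browser-use wires up its LLM client may differ:

```python
# Assumed setup (not from the report):
#   vllm serve Qwen/Qwen2-VL-7B-Instruct --port 8000
# Then point any OpenAI-compatible client at the local endpoint.

from openai import OpenAI

# vLLM ignores the API key, but the client requires a non-empty value.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2-VL-7B-Instruct",  # must match the model given to `vllm serve`
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this page."},
                # Placeholder image URL; in the browser-use scenario this would be
                # a screenshot captured from the controlled browser.
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/screenshot.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```

The crash message quoted above ("multimodal tokens ... 0 placeholders") usually indicates that the request reached vLLM with image data attached but no image placeholder in the rendered prompt, which suggests the issue lies in how the client builds the multimodal message rather than in the model itself.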

Proposed Solution

Add browser-use support for the OpenAI-compatible API capability provided by vLLM.

Alternative Solutions

No response

Additional Context

No response

sakshi-joshi-handle · Feb 14, 2025 20:02