[Bug]: Help needed - Local vllm deployment doesn't receive the request from desktop
Version
0.0.7
Model
UI-TARS-7B-SFT
Deployment Method
Local
Issue Description
Hi, I'd like some help running UI-TARS-desktop against a local deployment. I have successfully launched the vLLM service, and the service itself should be working since I can connect to it from open-webui.
But when I use the desktop exe (version 0.0.7; I don't know how to install the latest version, as there is no exe), after I input the prompt, the vLLM side never receives any request. Can anyone tell me whether there's a config option or anything else I'm doing wrong? Or how do I get logs from the frontend to diagnose the issue? Thanks!
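One way to narrow this down is to confirm the vLLM OpenAI-compatible endpoint accepts the same kind of chat-completions request the desktop app would send. Below is a minimal stdlib-only sketch; the base URL, port, and model name are assumptions, so adjust them to match your deployment (the model name must be exactly what vLLM serves under `/v1/models`):

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000/v1"  # assumption: vLLM's default port
MODEL = "UI-TARS-7B-SFT"               # assumption: must match the served model name


def build_chat_payload(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }


def send_chat(prompt: str) -> str:
    """POST the payload to /chat/completions and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_payload(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # vLLM accepts any key unless started with --api-key
            "Authorization": "Bearer empty",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(send_chat("Hello"))
```

If this script gets a reply but the desktop app's requests never show up in the vLLM logs, the problem is on the desktop side (e.g. the base URL / model name configured in its settings), not in the vLLM deployment.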
Error Logs
No response
Thanks for your feedback, which deployment tool do you use in local?
Hi, I'm using vLLM, and I believe the vLLM connection itself is fine since I can get responses from open-webui.
But with UI-TARS-Desktop, the vLLM backend never receives any request. I'm not sure if I'm doing something wrong. Hoping for some help or guidance, thanks!
One more update: following https://github.com/bytedance/UI-TARS?tab=readme-ov-file#start-an-openai-api-service, I can also successfully run the sample code against my vLLM deployment via the OpenAI API. So the issue should be in the desktop app. I'm using 0.0.7; I'm not sure whether it has problems, and for more recent versions there is no documentation on how to install them. Related issue: https://github.com/bytedance/UI-TARS-desktop/issues/366
Thank you for your reply! It's worth noting that we have accumulated a large number of issues and are working hard to get through them.
Regarding the problem you reported, here are two things you can try first:
- Solution 1: Use the Cloud-deployed UI-TARS model for testing. Local UI-TARS-7B-SFT deployment is not 100% stable.
- Solution 2: Try debugging GUIAgent yourself with the help of our contribution guide.
If you still run into any problems, please feel free to keep the conversation going.
v0.0.7 is indeed the latest version of UI-TARS Desktop. The issue you mention in https://github.com/bytedance/UI-TARS-desktop/issues/366 is about our new app, Agent TARS; there are some differences between the two, so you first need to clarify which one you want.
Sure, I will follow your suggestions. Thanks for your response!
Compile the app from source against your local vLLM. You can then open it directly without any problem calling the vLLM interface; just run it with administrator privileges. That's what I did.
Hi @daker11123, thanks for your reply! Could you provide more steps/details on how you did it? Are you following this guide: https://github.com/bytedance/UI-TARS-desktop/blob/main/CONTRIBUTING.md? As I understand it, that only builds the GUI, and vLLM is started separately. How do you compile it against the local vLLM? Thanks!
@hkvision I simply did a git clone of the current main branch and built ui-tars myself; the exact command is `pnpm run build`. The vLLM environment is started with Docker, following: https://docs.vllm.ai/en/latest/deployment/docker.html. After building, run it with administrator privileges. The configuration is as follows