
Plugin-Expansion “OpenAI Compatible Dify App” inaccessible

Open zuohaoda opened this issue 10 months ago • 6 comments

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.0.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

The Plugin-Expansion "OpenAI Compatible Dify App" is inaccessible from an external network.

The generated endpoint address uses localhost (http://localhost/e/mgwantstwbg1k49c/chat/completions). How can I change it so the endpoint is reachable from an external network? And how should the model name / Endpoint Name be written when connecting through Open WebUI?

Image

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

zuohaoda avatar Mar 13 '25 02:03 zuohaoda

To access the "OpenAI Compatible Dify App" from an external network, you need to change the generated address from localhost to your host's IP address. For example, if your host's IP address on the local network is 192.168.1.100, you should use http://192.168.1.100:8080 as the server URL. This change allows external devices on the same network to access the app.

For specifying the model name through the Open WebUI and Endpoint Name, you can configure it in Dify as follows:

  1. For the LLM model ggml-gpt4all-j:

    • Model Type: Text Generation
    • Model Name: gpt-3.5-turbo (this name can be customized in models/gpt-3.5-turbo.yaml)
    • Server URL: Use http://<your-LocalAI-endpoint-domain>:8080, replacing <your-LocalAI-endpoint-domain> with your host's IP address.
  2. For the embedding model all-MiniLM-L6-v2:

    • Model Type: Embedding
    • Model Name: text-embedding-ada-002 (this name can be customized in models/embeddings.yaml)
    • Server URL: Again, use http://<your-LocalAI-endpoint-domain>:8080, replacing <your-LocalAI-endpoint-domain> with your host's IP address.

Ensure that the network configuration allows the Dify container to access the LocalAI endpoint by using the host IP address instead of localhost [1].
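
As a quick sanity check, you can confirm from another machine on the same network that the server is reachable via the host IP rather than localhost. This is only a sketch: 192.168.1.100 and port 8080 are placeholders for your actual host IP and port, and it assumes the server exposes the standard OpenAI-compatible /v1/models route.

curl http://192.168.1.100:8080/v1/models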




dosubot[bot] avatar Mar 13 '25 02:03 dosubot[bot]


Hello, do you know how to configure it? I also encountered this problem.

bumblebee-code-gh avatar Mar 13 '25 09:03 bumblebee-code-gh

Hey, you did not follow the instructions in the GitHub issue template, and you are not describing a bug but asking for help. Nevertheless, you have created a bug report. In the future, please contact the community through the channels the template points to.

Image

However, your question is about how to configure self-hosted instances in 1.0.x so plugin endpoints are reachable from the internet, right?

First, you need a public domain or IP address. You also need to set up your hosting so that requests to that domain/IP are forwarded to Dify's nginx container, which will then forward them to the plugin_daemon container.

If you have a public domain/IP, configure it as ENDPOINT_URL_TEMPLATE, see here: https://github.com/langgenius/dify/blob/e796937d0233578abcd92dd2779a220a7b90e099/docker/.env.example#L971 and restart all Dify services to apply the changes.

Example: ENDPOINT_URL_TEMPLATE=https://dify.my-domain.com/e/{hook_id}. Dify will then give you the correct URL when you set up endpoints, e.g. https://dify.my-domain.com/e/mgwantstwbg1k49c/chat/completions.
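
For reference, here is a minimal sketch of applying this on a standard docker compose deployment; the paths and dify.my-domain.com are placeholders, so adjust them to your setup:

# run in the docker/ directory of your Dify checkout
cd dify/docker
# add the template to .env if it is not set yet (otherwise edit the existing line)
grep -q '^ENDPOINT_URL_TEMPLATE=' .env || \
  echo 'ENDPOINT_URL_TEMPLATE=https://dify.my-domain.com/e/{hook_id}' >> .env
# restart all Dify services so the change takes effect
docker compose down && docker compose up -d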

I hope that helps and please remember to ask questions via GitHub discussions and not via GitHub issues.

perzeuss avatar Mar 13 '25 20:03 perzeuss

Is the endpoint name used as the model name for the OpenAI client?

chenzikun avatar Mar 24 '25 03:03 chenzikun

I'm even more confused about this plugin. Could some documentation be added on how to use it?

chenzikun avatar Mar 26 '25 06:03 chenzikun

@zuohaoda Do you know how to use this plugin? I'm confused about the address I get after configuring it.

  1. Which URL should be filled in as the OpenAI proxy address?
  2. How should the OpenAI model_name be written?

chenzikun avatar Mar 26 '25 09:03 chenzikun

There isn't much to fill in here. Just send requests in the OpenAI format; the model field can even be omitted. You assemble the message history yourself, and inputs can carry your workflow's custom parameters. conversation_id is not supported.

curl --location --request POST 'http://yourhost:port/e/xxxxxxxxxx/chat/completions' \
--header 'Authorization: Bearer dify-api-key' \
--header 'Content-Type: application/json' \
--data-raw '{
  "inputs": {
    "world_id": 23,      
  },
    "messages": [
    {
      "role": "user",
      "content": "有人吗"
    }
  ],
  "stream": true
}'
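
As a non-streaming variant (again just a sketch; host, port, hook id and API key are placeholders), the response should follow the usual OpenAI chat-completions shape, so the assistant reply can be pulled out with jq:

curl -s --location --request POST 'http://yourhost:port/e/xxxxxxxxxx/chat/completions' \
--header 'Authorization: Bearer dify-api-key' \
--header 'Content-Type: application/json' \
--data-raw '{
  "messages": [
    { "role": "user", "content": "Anyone there?" }
  ],
  "stream": false
}' | jq -r '.choices[0].message.content'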

gongxiaokai avatar Jul 23 '25 04:07 gongxiaokai