
The frontend cannot show the text generated by the LLM

Open zengqingfu1442 opened this issue 9 months ago • 2 comments

Self Checks

  • [X] This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • [X] I have searched for existing issues, including closed ones.
  • [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [X] Please do not modify this template :) and fill in all the required fields.

Dify version

0.6.6

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

I deployed a local OpenAI-API-compatible server, but the frontend does not show the text generated by the LLM. However, the logs of my local OpenAI-API-compatible server do show the generated text, which indicates that the model really did produce output. See the dialogue on the right in the screenshots below.

[screenshots attached to the original issue]

✔️ Expected Behavior

The frontend shows the text generated by the LLM.

❌ Actual Behavior

The frontend does not show the text generated by the LLM.

zengqingfu1442 · May 04 '24 06:05

It looks like the issue with the frontend not displaying the generated text from your LLM might be due to how the Docker network is configured, particularly how your services communicate with each other. Here is a streamlined approach to troubleshooting and potentially resolving it:

  1. Network Configuration: Ensure your LLM service is accessible from within Docker by binding it to 0.0.0.0 instead of localhost or 127.0.0.1. This makes the service available on your local network.

  2. Correct IP Address: In your Dify configuration, use the actual IP address of the machine where the LLM service is running instead of localhost. This could be the local network IP (e.g., 192.168.x.x) or the Docker host IP if you're running everything on the same machine but in different containers.

  3. Check Dify Frontend Configurations: Verify that the model name, base URL, and other relevant settings in Dify are correctly configured to match your LLM service. This includes setting the correct model type and ensuring the base URL points to where your LLM service is accessible.

  4. Review Logs for Specific Errors: If the problem persists, check the frontend logs for any specific error messages that could give more insight into the issue.

By following these steps, you should be able to diagnose and fix the problem with the frontend not displaying the generated text; a minimal connectivity check you can run from inside the Docker network is sketched below. If you encounter any specific errors or need further assistance, feel free to ask.
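
As a starting point for steps 1, 2, and 4, the sketch below checks whether the OpenAI-compatible endpoint is reachable from inside the Docker network, using only the Python standard library. The base URL, port, API key, and model name are placeholders rather than values taken from this issue; run it from inside the Dify api or worker container (container names depend on your compose setup).

```python
# Minimal connectivity check for an OpenAI-compatible endpoint.
# All endpoint details below are placeholders; substitute your own.
import json
import urllib.request

BASE_URL = "http://192.168.1.100:8000/v1"  # placeholder: host LAN IP and port of the LLM server
API_KEY = "sk-placeholder"                 # placeholder: only needed if your server checks it

payload = {
    "model": "my-local-model",             # placeholder model name
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": False,
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)
with urllib.request.urlopen(req, timeout=30) as resp:
    body = json.loads(resp.read())
    # If this prints the generated text, the container can reach the server
    # and the non-streaming path works end to end.
    print(body["choices"][0]["message"]["content"])
```

If this request fails with a connection error from inside the container but succeeds from the host, the problem is almost certainly the binding or IP address described in steps 1 and 2.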

dosubot[bot] · May 04 '24 06:05

My local OpenAI-API-compatible server only implements the /v1/models and /v1/chat/completions APIs, and the /v1/chat/completions API also passed the integration test dify/api/tests/integration_tests/model_runtime/openai_api_compatible/test_llm.py at 0.6.4 · langgeni
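
One more thing that might be worth checking: the integration test may exercise the non-streaming path, while Dify's chat UI normally consumes the model output as a stream, so malformed SSE framing for stream: true responses could leave the UI blank even though the model generated text. A hedged sketch for exercising the streaming variant of /v1/chat/completions directly, with placeholder base URL, API key, and model name, is below.

```python
# Hedged sketch: call the streaming variant of /v1/chat/completions and print
# the SSE chunks as an OpenAI-compatible client would. Endpoint details are
# placeholders, not values from this issue.
import json
import urllib.request

BASE_URL = "http://192.168.1.100:8000/v1"  # placeholder
payload = {
    "model": "my-local-model",             # placeholder model name
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": True,
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-placeholder",  # placeholder key
    },
)
with urllib.request.urlopen(req, timeout=60) as resp:
    # OpenAI-style streaming sends SSE lines of the form "data: {json}" and
    # finishes with "data: [DONE]".
    for raw in resp:
        line = raw.decode("utf-8").strip()
        if not line.startswith("data:"):
            continue
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        print(delta.get("content") or "", end="", flush=True)
    print()
```

If the non-streaming call returns text but this one prints nothing, the streaming response format is the place to look.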

zengqingfu1442 · May 04 '24 06:05