
ConnectionError with get_llm_num_tokens

caiqfrog opened this issue 9 months ago • 2 comments

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English; otherwise they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.0.0

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Environment: Dify 1.0.0 with the Ollama plugin 0.0.3, Ollama 0.5.11, and the deepseek-r1:8b model.

When I create a chatbot and then use Debug & Preview, dify-api throws an exception. The logs follow:

```
2025/03/02 00:06:53 stdout 2025-03-01 16:06:53.006 ERROR [Dummy-9] [base_app_generate_response_converter.py:123] - {"args":{},"error_type":"ConnectionError","message":"('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))"}
2025/03/02 00:06:53 stdout core.plugin.manager.exc.PluginInvokeError: PluginInvokeError: {"args":{},"error_type":"ConnectionError","message":"('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))"}
2025/03/02 00:06:53 stdout raise PluginInvokeError(description=message)
2025/03/02 00:06:53 stdout File "/app/api/core/plugin/manager/base.py", line 221, in _handle_plugin_daemon_error
2025/03/02 00:06:53 stdout self._handle_plugin_daemon_error(error.error_type, error.message)
2025/03/02 00:06:53 stdout File "/app/api/core/plugin/manager/base.py", line 189, in _request_with_plugin_daemon_response_stream
2025/03/02 00:06:53 stdout ^^^^^^^^
2025/03/02 00:06:53 stdout for resp in response:
2025/03/02 00:06:53 stdout File "/app/api/core/plugin/manager/model.py", line 231, in get_llm_num_tokens
2025/03/02 00:06:53 stdout ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025/03/02 00:06:53 stdout return plugin_model_manager.get_llm_num_tokens(
2025/03/02 00:06:53 stdout File "/app/api/core/model_runtime/model_providers/__base/large_language_model.py", line 299, in get_num_tokens
2025/03/02 00:06:53 stdout ^^^^^^^^^^^^^^^^^^^^^^^^^
2025/03/02 00:06:53 stdout return function(*args, **kwargs)
2025/03/02 00:06:53 stdout File "/app/api/core/model_manager.py", line 370, in _round_robin_invoke
2025/03/02 00:06:52 stdout self._round_robin_invoke(
2025/03/02 00:06:52 stdout File "/app/api/core/model_manager.py", line 195, in get_llm_num_tokens
2025/03/02 00:06:52 stdout ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025/03/02 00:06:52 stdout prompt_tokens = model_instance.get_llm_num_tokens(prompt_messages)
2025/03/02 00:06:52 stdout File "/app/api/core/app/apps/base_app_runner.py", line 86, in get_pre_calculate_rest_tokens
2025/03/02 00:06:52 stdout self.get_pre_calculate_rest_tokens(
2025/03/02 00:06:52 stdout File "/app/api/core/app/apps/chat/app_runner.py", line 58, in run
2025/03/02 00:06:52 stdout runner.run(
2025/03/02 00:06:52 stdout File "/app/api/core/app/apps/chat/app_generator.py", line 223, in _generate_worker
2025/03/02 00:06:52 stdout Traceback (most recent call last):
2025/03/02 00:06:52 stdout 2025-03-01 16:06:52.998 ERROR [Thread-8 (_generate_worker)] [app_generator.py:243] - Unknown Error when generating
```

The Ollama API itself is fine; I have tested it with other clients.
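A check like the one above can also be run from inside the Dify API container, which is where the connection actually fails. A minimal sketch, assuming the default Ollama port (11434), the `host.docker.internal` hostname, and the `docker-api-1` container name that Dify's docker-compose setup typically produces; substitute your own values:

```shell
# Hypothetical reachability check: the URL below is an assumption, use
# whatever base URL you entered in the Ollama model-provider settings.
OLLAMA_URL="http://host.docker.internal:11434"

# /api/tags lists installed models; a JSON response from inside the
# dify-api container means connectivity is fine and the problem lies
# elsewhere. Guarded so the check degrades gracefully without docker.
if command -v docker >/dev/null 2>&1; then
  docker exec docker-api-1 curl -sf "$OLLAMA_URL/api/tags" \
    || echo "Ollama not reachable at $OLLAMA_URL from inside the container"
fi
```

Testing from the host with another client does not exercise the container's network namespace, so a check from inside the container is the more telling one.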

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

caiqfrog avatar Mar 01 '25 16:03 caiqfrog

@caiqfrog Hello there! I'm here to help you with bug reports, answer your questions, and assist you in becoming a contributor. Let me know how I can assist you today!

To resolve the ConnectionError with get_llm_num_tokens when using Dify 1.0.0 with the Ollama plugin in a self-hosted Docker environment, you should:

  1. Replace your local IP address (e.g., 192.168.xx.xx:11434) with http://host.docker.internal:11434 in the Ollama configuration. This change allows Docker containers to access services running on the host machine.

  2. Ensure that the OLLAMA_HOST environment variable is set to 0.0.0.0 to make Ollama accessible from outside the container. Depending on your operating system, you may need to set this environment variable using launchctl on macOS, systemctl on Linux, or through the system environment variables on Windows.

  3. If you are using network tools like Clash, enabling global TUN mode in Clash can also resolve network connectivity issues with the Ollama plugin.
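The first two steps above can be sketched as follows. The service name `ollama.service`, the compose snippet, and the exact file locations are assumptions based on common Ollama and Docker setups, so verify them against your own installation:

```shell
# 1. In Dify's Ollama model-provider form, point the base URL at the host
#    rather than a LAN or loopback address:
#      Base URL: http://host.docker.internal:11434
#
# 2. Make Ollama listen on all interfaces so containers can reach it.
#    Linux (systemd): add an override, then restart the service:
#      sudo systemctl edit ollama.service
#        [Service]
#        Environment="OLLAMA_HOST=0.0.0.0"
#      sudo systemctl daemon-reload && sudo systemctl restart ollama
#    macOS:
#      launchctl setenv OLLAMA_HOST "0.0.0.0"   # then restart the Ollama app
#
# 3. On Linux, host.docker.internal only resolves if the compose service
#    maps it to the host gateway (recent Dify compose files include this;
#    check before adding it yourself):
#      extra_hosts:
#        - "host.docker.internal:host-gateway"
```

The underlying issue is that `127.0.0.1` inside the dify-api container refers to the container itself, not the machine running Ollama, which is why the plugin's request is dropped with `RemoteDisconnected`.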




dosubot[bot] avatar Mar 01 '25 16:03 dosubot[bot]

same

opsvr202 avatar Mar 02 '25 01:03 opsvr202

See https://github.com/langgenius/dify-official-plugins/issues?q=is%3Apr+author%3AJohnJyong

crazywoola avatar Mar 02 '25 02:03 crazywoola