
public deepseek: DeepseekException - Unable to get json response - Expecting value: line 1 column 1 (char 0)

Open Ahaha1998 opened this issue 9 months ago • 4 comments

Describe the bug
A clear and concise description of what the bug is.

To Reproduce
Steps to reproduce the behavior:

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error

Expected behavior
A clear and concise description of what you expected to happen.

Screenshots
If applicable, add screenshots to help explain your problem.

Desktop (please complete the following information):

  • OS: [e.g. iOS]
  • Browser [e.g. chrome, safari]

Wren AI Information

  • Version: [e.g. 0.1.0]

Additional context
Add any other context about the problem here.

Relevant log output

  • Please share your config.yaml with us; it should be located at ~/.wrenai/config.yaml.
  • Please share your logs with us with the following command:
    docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
    docker logs wrenai-wren-ai-service-1 >& wrenai-wren-ai-service.log && \
    docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
    docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log
    

Ahaha1998 · Mar 03 '25 02:03

config.yaml (2).txt

env.txt

wrenai-wren-ai-service-1.txt

As the title shows, I am using the open version of DeepSeek and am encountering this error.

Ahaha1998 · Mar 03 '25 02:03

Hi @Ahaha1998, I noticed the error message in your log.

E0303 01:52:47.722 8 wren-ai-service:132] Request fe50d01e-1cc2-4c18-878a-9461dd3b5245: Error validating question: litellm.APIError: APIError: DeepseekException - Unable to get json response - Expecting value: line 1 column 1 (char 0), Original Response: 

I think this is due to the LLM you are using being unable to reliably produce output in JSON format, which then prevents Wren AI from executing the next step.
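
For context, the "Expecting value: line 1 column 1 (char 0)" part is Python's standard JSON decode error when the parser is handed an empty or non-JSON string, which is what happens if the model returns nothing or plain text instead of JSON. A minimal sketch of how that error surfaces (just an illustration, not WrenAI's actual code):

    import json

    llm_response = ""  # e.g. the model returned an empty or non-JSON reply
    try:
        parsed = json.loads(llm_response)
    except json.JSONDecodeError as e:
        # e -> "Expecting value: line 1 column 1 (char 0)"
        print(f"Unable to get json response - {e}")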

paopa · Mar 03 '25 10:03

Hi @Ahaha1998, if possible, could you integrate with https://langfuse.com/ (cloud or self-hosted are both okay)? Just put the Langfuse keys in the .env file, and then you can see more detail about the execution on Langfuse. It would also help me understand the issue. Thank you!

LANGFUSE_SECRET_KEY="sk-xxx"
LANGFUSE_PUBLIC_KEY="pk-xxx"

If you self-host Langfuse, you need to change the host in config.yaml.

  langfuse_host: https://cloud.langfuse.com  # <-- change this if you self-host
  langfuse_enable: true
  logging_level: DEBUG
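
For example, with a self-hosted Langfuse instance the host would point at your own deployment instead (the URL below is only a placeholder; Langfuse's default self-hosted port is 3000):

  langfuse_host: http://localhost:3000  # replace with your self-hosted Langfuse URL
  langfuse_enable: true
  logging_level: DEBUG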

paopa · Mar 04 '25 11:03

Hi @Ahaha1998, other people have also encountered the same JSON decode error; you can check the comment at https://github.com/Canner/WrenAI/issues/1354#issuecomment-2704457311

paopa · Mar 06 '25 17:03