Pao Sheng
Hey @yuzhi-jiang, if the files are the same, I’m not sure why the error occurred. Maybe you could try stopping the container and starting it again? Thanks for the suggestion!...
Hi @yuzhi-jiang, I got the same issue as yours from another community member. Can you give Wren AI 0.15.4 a try? We just released it this week, and I’ve been...
> > LGTM! leave a question comment only.
>
> if chart is invalid, the chart type is empty string

Can you mention this in the codebase? I think it...
Hi @Archilht, did you solve this issue? If not, you might refer to my example Ollama config for the llm and embedder sections.

```yaml
models:
  - api_base: ...
```
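Since that snippet got cut off above, here’s a minimal sketch of what the llm and embedder sections could look like, assuming Ollama is reachable at `http://host.docker.internal:11434` through the litellm provider; the model names, provider prefixes, and timeout values here are assumptions, so adjust them to your setup and Wren AI version.

```yaml
type: llm
provider: litellm_llm
models:
  # assumption: a chat model you have already pulled into Ollama
  - api_base: http://host.docker.internal:11434
    model: ollama_chat/llama3.1:8b
    timeout: 600
    kwargs:
      n: 1
      temperature: 0

---
type: embedder
provider: litellm_embedder
models:
  # assumption: an embedding model served by the same Ollama instance
  - api_base: http://host.docker.internal:11434
    model: ollama/nomic-embed-text
    timeout: 600
```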
Hi @YISIO, if you’re using version 0.15.3, could you please check out this link: https://github.com/Canner/WrenAI/blob/4d6c82ca69985a7d38270b7a4b815b2276f1fd9c/wren-ai-service/docs/config_examples/config.deepseek.yaml#L96-L97 and fill in the missing settings? We removed those pipeline settings in version 0.15.4. Thanks a...
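For context, the pipeline settings in those config examples generally take this shape; this is only an illustrative sketch with made-up pipe names, so please copy the actual lines from the link above rather than from here.

```yaml
type: pipeline
pipes:
  # illustrative entries only; the real names and values are at the linked lines
  - name: sql_generation
    llm: litellm_llm.default
    engine: wren_ui
  - name: chart_generation
    llm: litellm_llm.default
```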
> [@paopa](https://github.com/paopa) [@onlyjackfrost](https://github.com/onlyjackfrost) thanks, the container has successfully started with the new config.yaml, but when I ask a question it errors. The log:
>
> During handling of the above exception, another exception occurred:...
Hi @YISIO, thank you for providing the scenario. Could you also dump the log after performing these actions, along with the config and .env file? Thanks! BTW, we released 0.15.4...
Hi @jianwang770509, thank you for creating the issue. Can you provide more info to help us trace it? For example, how do you start Wren AI?...
Hi @Ahaha1998, I noticed the error message in your log.

```
E0303 01:52:47.722 8 wren-ai-service:132] Request fe50d01e-1cc2-4c18-878a-9461dd3b5245: Error validating question: litellm.APIError: APIError: DeepseekException - Unable to get json response -...
```
Hi @Ahaha1998, if it's possible, can you integrate with https://langfuse.com/ (cloud or self-hosted are both okay)? Just put the Langfuse keys in the .env file and then you can see more...
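If it helps, here’s a rough sketch of the .env additions, assuming the standard Langfuse variable names; please double-check the exact keys against the .env example that ships with your Wren AI version.

```
# assumption: standard Langfuse credentials; copy the values from your Langfuse project settings
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
# Langfuse Cloud URL or your self-hosted instance
LANGFUSE_HOST=https://cloud.langfuse.com
```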