
[Bug] : Error info: The job decomposition result is empty after retry.

Open jiefei30 opened this issue 4 months ago • 3 comments

environment : macos Python 3.10.18 node.js v22.18.0

I followed Quickstart.md to build Chat2Graph and opened the web UI successfully. I added a Neo4j database containing some vertices and edges. When I asked "tell me the upstream Column of the 'val_main' Column", the response was:

An error occurred during the execution of the job:

The job fb8966ad-3f40-42e2-a23d-9abe82c7f544 could not be decomposed correctly after retry. Please try again.
Error info: The job decomposition result is empty after retry.

Check the error details in path: '/Users/xxx/.chat2graph/logs/server.log'

Please check the job fb8966ad-3f40-42e2-a23d-9abe82c7f544 ("tell me th...") for more details. Or you can re-try to send your message.

.env config:

MODEL_PLATFORM_TYPE="LITELLM"  # Choose "LITELLM" or "AISUITE"

# for more info about LLM and embedding models, please refer to doc: doc/en-us/deployment/config-env.md
LLM_NAME=openai/XXX
LLM_ENDPOINT=http://XXX/v1
LLM_APIKEY=XXX

EMBEDDING_MODEL_NAME=text-embedding-3-large
EMBEDDING_MODEL_ENDPOINT=http://XXX/v1/embeddings
EMBEDDING_MODEL_APIKEY=XXX

TEMPERATURE=0
MAX_TOKENS=8192 # required by DeepSeek-V3
PRINT_REASONER_MESSAGES=1
PRINT_SYSTEM_PROMPT=1

LANGUAGE=en-US

server.log:

/opt/anaconda3/envs/chat2graph_env/lib/python3.10/site-packages/pydantic/main.py:463: UserWarning: Pydantic serializer warnings:
  PydanticSerializationUnexpectedValue(Expected 9 fields but got 6: Expected `Message` - serialized value may not be as expected [input_value=Message(content="\nSure! ...haring their query.\n"}), input_type=Message])
  PydanticSerializationUnexpectedValue(Expected `StreamingChoices` - serialized value may not be as expected [input_value=Choices(finish_reason='st...ider_specific_fields={}), input_type=Choices])
  return self.__pydantic_serializer__.to_python(
127.0.0.1 - - [05/Aug/2025 14:03:50] "GET /api/jobs/0e7be8ac-6fd6-428c-9839-5ced358f28d0/message HTTP/1.1" 200 -
127.0.0.1 - - [05/Aug/2025 14:03:53] "GET /api/jobs/0e7be8ac-6fd6-428c-9839-5ced358f28d0/message HTTP/1.1" 200 -
127.0.0.1 - - [05/Aug/2025 14:03:56] "GET /api/jobs/0e7be8ac-6fd6-428c-9839-5ced358f28d0/message HTTP/1.1" 200 -

The above log lines are printed in a loop. The same thing happens even when I ask basic questions such as "What is a graph?".

jiefei30 avatar Aug 06 '25 03:08 jiefei30

Hi @jiefei30 , I need more information from server.log (the log you provided only shows a LiteLLM warning, which is common and does not affect the project). I would also like to know which LLM_NAME and LLM_ENDPOINT you are using (make sure your large-model configuration is correct). Can you curl the endpoint?

Follow doc/en-us/deployment/config-env.md or doc/zh-cn/deployment/config-env.md to configure the LLM.
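As a quick sanity check, something like the following curl against an OpenAI-compatible /chat/completions route should return JSON. This is a hedged sketch: the URL, key, and model name below are placeholders, not values from this issue, and it assumes the private endpoint speaks the OpenAI chat-completions protocol.

```shell
# Replace the placeholders with your real LLM_ENDPOINT, LLM_APIKEY, and LLM_NAME.
curl -s "http://YOUR_ENDPOINT/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "YOUR_MODEL_NAME",
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 8
  }'
```

A JSON body containing a "choices" array means the endpoint is reachable with that key; a connection error or an HTML error page points at the endpoint or key configuration rather than at Chat2Graph.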

Appointat avatar Aug 07 '25 03:08 Appointat

127.0.0.1 - - [12/Aug/2025 20:20:40] "GET /api/jobs/d147ae5e-71a7-44cb-bedb-1fb6c07c9f26/message HTTP/1.1" 200 -
No relevant docs were retrieved using the relevance score threshold 0.3


Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.

[WARNING]: Initial decomposition failed or validation error: litellm.APIError: APIError: OpenAIException - 

and this line repeats in a loop: 127.0.0.1 - - [12/Aug/2025 20:21:37] "GET /api/jobs/d147ae5e-71a7-44cb-bedb-1fb6c07c9f26/message HTTP/1.1" 200 -

Sorry, I can't tell you the name of the model (XXX), because it is a private third-party model of our company, but I can confirm that I am able to call the model through Postman. @Appointat

jiefei30 avatar Aug 12 '25 12:08 jiefei30

I think the .env config you are using to call LiteLLM is incorrect. If you are using a private model, you should use the self-hosted model invocation method: see doc/zh-cn/deployment/config-env.md and https://docs.litellm.ai/docs/providers/vllm (self-hosted mode).
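For reference, a hedged sketch of what the .env might look like for a self-hosted, OpenAI-compatible model routed through LiteLLM's hosted_vllm provider. All values below are placeholders; check the linked LiteLLM and config-env.md docs for the authoritative format for your deployment.

```
MODEL_PLATFORM_TYPE="LITELLM"

# The "hosted_vllm/" prefix tells LiteLLM to treat the endpoint as a
# self-hosted OpenAI-compatible server (see docs.litellm.ai/docs/providers/vllm).
LLM_NAME=hosted_vllm/your-model-name
LLM_ENDPOINT=http://your-host:8000/v1
LLM_APIKEY=your-key-if-required
```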

BTW, the place where Chat2Graph calls LiteLLM is app/plugin/lite_llm/lite_llm_client.py @jiefei30

Appointat avatar Aug 12 '25 12:08 Appointat