
neo4j_graphrag.exceptions.SchemaExtractionError: LLM response is not valid JSON when using VertexAI with Simple KGPipeline

Open · xpilasneo4j opened this issue 4 months ago · 1 comment

Similar to https://github.com/neo4j/neo4j-graphrag-python/issues/376, I get an exception about the response format when loading data using VertexAI: neo4j_graphrag.exceptions.SchemaExtractionError: LLM response is not valid JSON.

I fixed it locally by adding self.response_mime_type = "application/json" at line 98 of vertexai_llm.py, but I guess this setting should instead come from the model_params object and be translated into the correct options for each LLM.

xpilasneo4j · Aug 20 '25

Hi Xavier,

As with OllamaLLM, you need to configure the response format when initializing the LLM instance. For VertexAILLM, it is:

from neo4j_graphrag.llm import VertexAILLM
from vertexai.generative_models import GenerationConfig

llm = VertexAILLM(
    model_name=model_name,
    generation_config=GenerationConfig(
        temperature=0.0,
        # force the model to return JSON so schema extraction can parse the response
        response_mime_type="application/json",
    )
)
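
For completeness, here is a minimal sketch of wiring that LLM into SimpleKGPipeline; the connection URI, credentials, embedder choice, and PDF path below are placeholders you would replace with your own:

import asyncio

import neo4j
from neo4j_graphrag.embeddings import VertexAIEmbeddings
from neo4j_graphrag.experimental.pipeline.kg_builder import SimpleKGPipeline

# placeholder connection details -- replace with your own
driver = neo4j.GraphDatabase.driver(
    "neo4j://localhost:7687", auth=("neo4j", "password")
)

kg_builder = SimpleKGPipeline(
    llm=llm,                        # the VertexAILLM configured above
    driver=driver,
    embedder=VertexAIEmbeddings(),  # any Embedder implementation works here
    from_pdf=True,
)

# run the pipeline on a document (the API is async)
asyncio.run(kg_builder.run_async(file_path="my_document.pdf"))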

stellasia · Aug 25 '25