neo4j-graphrag-python
neo4j_graphrag.exceptions.SchemaExtractionError: LLM response is not valid JSON when using VertexAI with Simple KGPipeline
Similar to https://github.com/neo4j/neo4j-graphrag-python/issues/376, I get an exception about the return type when loading data using VertexAI:

```
neo4j_graphrag.exceptions.SchemaExtractionError: LLM response is not valid JSON.
```
I fixed it by adding the following at line 98 of vertexai_llm.py:

```python
self.response_mime_type = "application/json"
```
But I suspect this should instead come from the model_params object and be translated into the correct options for each LLM.
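A minimal sketch of what such a translation layer could look like. The function name and the generic `response_format` key are purely illustrative assumptions, not part of the library's actual API:

```python
# Hypothetical helper: map a generic model_params dict into
# provider-specific options. Illustrative only; neo4j-graphrag-python
# does not expose this function.
def translate_model_params(provider: str, model_params: dict) -> dict:
    """Translate a generic {"response_format": "json"} hint into
    the option each LLM backend actually understands."""
    params = dict(model_params)  # avoid mutating the caller's dict
    if params.pop("response_format", None) == "json":
        if provider == "vertexai":
            # Vertex AI expects a MIME type in its GenerationConfig
            params["response_mime_type"] = "application/json"
        elif provider == "ollama":
            # Ollama expects format="json"
            params["format"] = "json"
    return params


print(translate_model_params("vertexai", {"response_format": "json", "temperature": 0.0}))
```

Each LLM wrapper would then apply the translated options when building its native request, so callers only ever set a backend-agnostic hint.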
Hi Xavier,
As with OllamaLLM, you need to configure the response format when initializing the LLM instance. For VertexAILLM, it is:

```python
from neo4j_graphrag.llm import VertexAILLM
from vertexai.generative_models import GenerationConfig

llm = VertexAILLM(
    model_name=model_name,
    generation_config=GenerationConfig(
        temperature=0.0,
        response_mime_type="application/json",
    ),
)
```