
A bug in the request for populate_graph_schema

Open Super-six-java opened this issue 1 year ago • 2 comments

When using the local Ollama Qwen2:72b model, I encountered an error while requesting populate_graph_schema:

2024-09-20 09:54:15,056 - Exception in getting the schema from text:Received unsupported arguments {'method': 'function_calling'}
Traceback (most recent call last):
  File "/data/workspace/neo4j/llm-graph-builder/backend/score.py", line 539, in populate_graph_schema
    result = populate_graph_schema_from_text(input_text, model, is_schema_description_checked)
  File "/data/workspace/neo4j/llm-graph-builder/backend/src/main.py", line 568, in populate_graph_schema_from_text
    result = schema_extraction_from_text(text, model, is_schema_description_cheked)
  File "/data/workspace/neo4j/llm-graph-builder/backend/src/shared/schema_extraction.py", line 40, in schema_extraction_from_text
    runnable = prompt | llm.with_structured_output(
  File "/home/ubuntu/anaconda3/envs/neo4j/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 1232, in with_structured_output
    raise ValueError(f"Received unsupported arguments {kwargs}")
ValueError: Received unsupported arguments {'method': 'function_calling'}
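The traceback bottoms out in langchain_core's base `with_structured_output`, which rejects any keyword argument it does not recognize; the community `ChatOllama` class does not override it to accept `method`. A minimal sketch of that validation behavior, using a hypothetical stand-in function rather than the real LangChain class:

```python
# Illustrative stand-in for the kwargs check in langchain_core's base
# with_structured_output; NOT the real LangChain implementation.
def with_structured_output(schema, **kwargs):
    # The base implementation supports no extra options, so any kwarg
    # such as method="function_calling" is rejected outright.
    if kwargs:
        raise ValueError(f"Received unsupported arguments {kwargs}")
    return schema  # the real method returns a Runnable


try:
    with_structured_output(dict, method="function_calling")
except ValueError as err:
    print(err)  # -> Received unsupported arguments {'method': 'function_calling'}
```

This is why the same call works with chat-model classes that implement structured output (and accept `method`), but fails here.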

Super-six-java commented Sep 20, 2024

@aashipandya

kartikpersistent commented Sep 20, 2024

There is an issue with function calling for Llama models; we are working on fixing it. We will update here when the fix lands in dev.

aashipandya commented Sep 20, 2024

You can replace the following line in llm.py: change `from langchain_community.chat_models import ChatOllama` to `from langchain_ollama import ChatOllama` (you will need to install the latest version of `langchain_ollama`).
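Concretely, the suggested change (shown here as a sketch; the file path `backend/src/llm.py` is an assumption based on the repo layout visible in the traceback) is a one-line import swap:

```diff
-from langchain_community.chat_models import ChatOllama
+from langchain_ollama import ChatOllama
```

after installing the dedicated integration package, e.g. `pip install -U langchain-ollama`. The `langchain_ollama` version of `ChatOllama` implements its own `with_structured_output`, so the `method="function_calling"` argument should no longer be rejected.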

kaustubh-darekar commented Apr 8, 2025