
[Bug] Using gemini-1.5-flash-latest as Chat model throws an error during knowledge graph creation

Open · marcfon opened this issue 4 months ago · 2 comments

Is there an existing issue for the same bug?

  • [X] I have checked the existing issues.

Branch name

main

Commit ID

--

Other environment information

No response

Actual behavior

Traceback (most recent call last):
  File "/ragflow/graphrag/graph_extractor.py", line 128, in __call__
    result, token_count = self._process_document(text, prompt_variables)
  File "/ragflow/graphrag/graph_extractor.py", line 177, in _process_document
    if response.find("**ERROR**") >=0: raise Exception(response)
Exception: **ERROR**: 400 Please use a valid role: user, model.
**ERROR**: contents must not be empty

Expected behavior

No response

Steps to reproduce

1. Set `Model Providers > System Model Settings > Chat model` to `gemini-1.5-flash-latest`
2. Create a knowledge graph 
3. Process a file

Additional information

Processing the file works fine when the Chat model is set to `gpt-4o-mini`.
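
My guess at the cause: both errors in the traceback match constraints of Gemini's chat API, which (as the 400 message itself says) only accepts the roles `user` and `model`, and rejects requests whose contents are empty. The history the graph extractor sends presumably uses OpenAI-style roles (`system`/`assistant`), which `gpt-4o-mini` accepts but Gemini does not. A rough sketch of the kind of remapping that might be needed is below; the helper name `to_gemini_history` is just illustrative and not RAGFlow code.

```python
# Illustrative sketch only: map OpenAI-style chat messages to the shape
# Gemini's chat API expects. Gemini accepts only the roles "user" and
# "model", and every entry must have non-empty contents, which matches
# the two errors in the traceback above.

def to_gemini_history(messages):
    """Convert {'role': 'system'|'user'|'assistant', 'content': str} messages
    into Gemini-style {'role': 'user'|'model', 'parts': [str]} entries."""
    history = []
    system_prefix = ""
    for msg in messages:
        content = (msg.get("content") or "").strip()
        if not content:
            continue  # drop empty turns: avoids "contents must not be empty"
        if msg["role"] == "system":
            # Gemini has no "system" role; fold it into the next user turn.
            system_prefix += content + "\n"
            continue
        role = "model" if msg["role"] == "assistant" else "user"
        if role == "user" and system_prefix:
            content = system_prefix + content
            system_prefix = ""
        history.append({"role": role, "parts": [content]})
    return history
```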

marcfon · Oct 05 '24 17:10