[Question]: bug in demo.ragflow.io/v1/api/completion
Describe your problem
I have tried at least three different REST clients, but the API call to demo.ragflow.io/v1/api/completion always returns the following response:
By contrast, the chat in the demo UI works correctly.
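For reference, this is roughly the request I am sending (a minimal sketch; the Bearer-token header and the conversation_id / messages body fields are my reading of the API docs, and the key and conversation ID below are placeholders):

```python
# Minimal sketch of the request, assuming the "Authorization: Bearer <key>" header
# and a JSON body with conversation_id / messages, as described in the API docs.
# API_KEY and CONVERSATION_ID are placeholders, not real values.
import requests

API_KEY = "ragflow-xxxxxxxx"
CONVERSATION_ID = "xxxxxxxxxxxxxxxx"

resp = requests.post(
    "https://demo.ragflow.io/v1/api/completion",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "conversation_id": CONVERSATION_ID,
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)
print(resp.status_code, resp.text)
```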
What kind of LLM did you configure?
Refer to https://github.com/infiniflow/ragflow/issues/1690. I guess this is a problem that needs to be fixed.
What kind of LLM did you configure?
I mean the LLM used for chat, not the embedding model.
What kind of LLM did you configure?
I don't know; I'm just trying to use the demo ( https://demo.ragflow.io/ ).
P.S.: your image is not visible...
