[Question]:Exception(f"Not supported doc engine: {DOC_ENGINE}……)
Describe your problem
When I run `docker logs -f ragflow-server`, the logs show the content of Exception(f"Not supported doc engine: {DOC_ENGINE}……). How can I solve this problem?
added:
2025-02-17T12:31:47.185980574Z raise Exception(f"Not supported doc engine: {DOC_ENGINE}")
2025-02-17T12:31:47.185982220Z Exception: Not supported doc engine: elasticRagsearch
2025-02-17T12:31:49.687340459Z 2025-02-17 20:31:49,685 INFO 6258 ragflow_server log path: /ragflow/logs/ragflow_server.log, log levels: {'peewee': 'WARNING', 'pdfminer': 'WARNING', 'root': 'INFO'}
2025-02-17T12:31:50.694437488Z 2025-02-17 20:31:50,693 INFO 6184 TextRecognizer det uses CPU
2025-02-17T12:31:50.784540385Z 2025-02-17 20:31:50,783 INFO 6184 TextRecognizer rec uses CPU
2025-02-17T12:31:50.807005816Z 2025-02-17 20:31:50,805 INFO 6184
2025-02-17T12:31:50.807032089Z ______ __ ______ __
2025-02-17T12:31:50.807034201Z /_ /_ / / / / _____ _______ / / _____
2025-02-17T12:31:50.807035792Z / / / __ `/ / /// / __/ | |// _ / / / / / __/ __ / /
2025-02-17T12:31:50.807037694Z / / / // ( ) ,< / /> </ / // // / // // / /
2025-02-17T12:31:50.807058989Z // _,///|| /_____//||_/_/_,/_/___//
2025-02-17T12:31:50.807060828Z
2025-02-17T12:31:50.807864596Z 2025-02-17 20:31:50,806 INFO 6184 TaskExecutor: RAGFlow version: v0.16.0-50-g6daae7f2 full
2025-02-17T12:31:50.808758519Z Traceback (most recent call last):
2025-02-17T12:31:50.808780148Z File "/ragflow/rag/svr/task_executor.py", line 737, in
In docker/.env, what's the value of DOC_ENGINE?
Keep it consistent with the code repo.
> In docker/.env, what's the value of DOC_ENGINE? Keep it consistent with the code repo.

thanks!
This is my DOC_ENGINE setting; I will compare it with the source code and make modifications
# - infinity (https://github.com/infiniflow/infinity)
DOC_ENGINE=${DOC_ENGINE:-elasticRagsearch}
#DOC_ENGINE=infinity
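For reference, the startup check that raises this exception can be sketched as follows. This is a hypothetical sketch, not the actual RAGFlow source: the supported names `elasticsearch` and `infinity` are assumed from the two options referenced in docker/.env.

```python
# Hypothetical sketch of the DOC_ENGINE validation that produces the
# "Not supported doc engine" error. The set of supported names is an
# assumption based on the options mentioned in docker/.env.
SUPPORTED_DOC_ENGINES = {"elasticsearch", "infinity"}

def validate_doc_engine(doc_engine: str) -> str:
    """Raise the same kind of exception the server logs on a bad value."""
    if doc_engine not in SUPPORTED_DOC_ENGINES:
        raise Exception(f"Not supported doc engine: {doc_engine}")
    return doc_engine

validate_doc_engine("elasticsearch")      # accepted
# validate_doc_engine("elasticRagsearch") # would raise the exception above
```

This is why the typo'd value `elasticRagsearch` crashes the server: it is not one of the names the code checks for.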
I pulled RAGFlow again and it now runs normally, but I encountered an issue when adding Ollama.
firstly,
- Model type: Chat
- Model name: deepseek-r1:8b, which I obtained via `ollama list`
- Base url: http://{localhost}:11434, which I obtained via ipconfig on Windows
- API-Key: none
- Max Tokens: set to 1, just to try
- Does it support Vision?: No
secondly,
I ran `curl http://${IP_OF_OLLAMA_MACHINE}:11434/` once, following the "Ensure Ollama is accessible" step of the Deploy a local LLM guide.
back : Ollama is running
but my RAGFlow runs in Docker and my Ollama runs on the same machine,
so I also tested `curl http://host.docker.internal:11434/`.
back : curl: (28) Failed to connect to host.docker.internal port 11434 after 21042 ms: Could not connect to server
In short,
I am unable to add the deepseek model from my local Ollama to the RAGFlow running in Docker.
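One likely cause of the timeout: `host.docker.internal` only resolves automatically on Docker Desktop (Windows/macOS); on a Linux Docker Engine host it is undefined unless you map it yourself. A hypothetical addition to the compose file (the service name `ragflow` is an assumption, match it to your docker-compose.yml):

```yaml
# Assumed fragment for docker/docker-compose.yml on Linux, where
# host.docker.internal is not provided by default.
services:
  ragflow:
    extra_hosts:
      # host-gateway resolves to the host's IP as seen from the container
      - "host.docker.internal:host-gateway"
```

After adding this and recreating the container, `curl http://host.docker.internal:11434/` from inside the container should reach the host.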
added: now I used the
sudo docker exec -it ragflow-server bash
root@8136b8c3e914:/ragflow# curl http://host.docker.internal:11434/
back: Ollama is running
it's OK!
but I still can't add Ollama models in RAGFlow.
Try 172.17.0.1:11434.
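For context: 172.17.0.1 is the default gateway of Docker's `docker0` bridge, i.e. the host as seen from a container. For this route to work, Ollama on the host must listen on all interfaces rather than only 127.0.0.1. A hedged config sketch, assuming Ollama runs as a systemd service on the Linux host:

```shell
# Assumption: Ollama is a systemd service bound to 127.0.0.1 by default,
# which containers cannot reach via the bridge IP.
sudo systemctl edit ollama.service
#   In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# Then, from inside the ragflow-server container:
curl http://172.17.0.1:11434/
```

If that curl returns "Ollama is running", use http://172.17.0.1:11434 as the Base url when adding the model in RAGFlow.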