
[Question]: Exception(f"Not supported doc engine: {DOC_ENGINE}……)

Open Gloryone-spiderman opened this issue 11 months ago • 6 comments

Describe your problem

When I run `docker logs -f ragflow-server`, the logs show `Exception(f"Not supported doc engine: {DOC_ENGINE}……)`. How can I solve this problem?

Gloryone-spiderman avatar Feb 17 '25 13:02 Gloryone-spiderman

Added:

```
2025-02-17T12:31:47.185980574Z     raise Exception(f"Not supported doc engine: {DOC_ENGINE}")
2025-02-17T12:31:47.185982220Z Exception: Not supported doc engine: elasticRagsearch
2025-02-17T12:31:49.687340459Z 2025-02-17 20:31:49,685 INFO 6258 ragflow_server log path: /ragflow/logs/ragflow_server.log, log levels: {'peewee': 'WARNING', 'pdfminer': 'WARNING', 'root': 'INFO'}
2025-02-17T12:31:50.694437488Z 2025-02-17 20:31:50,693 INFO 6184 TextRecognizer det uses CPU
2025-02-17T12:31:50.784540385Z 2025-02-17 20:31:50,783 INFO 6184 TextRecognizer rec uses CPU
2025-02-17T12:31:50.807005816Z 2025-02-17 20:31:50,805 INFO 6184
[RAGFlow ASCII-art banner]
2025-02-17T12:31:50.807864596Z 2025-02-17 20:31:50,806 INFO 6184 TaskExecutor: RAGFlow version: v0.16.0-50-g6daae7f2 full
2025-02-17T12:31:50.808758519Z Traceback (most recent call last):
2025-02-17T12:31:50.808780148Z   File "/ragflow/rag/svr/task_executor.py", line 737, in <module>
2025-02-17T12:31:50.808785269Z     main()
2025-02-17T12:31:50.808787543Z   File "/ragflow/rag/svr/task_executor.py", line 713, in main
2025-02-17T12:31:50.808789843Z     settings.init_settings()
2025-02-17T12:31:50.808791972Z   File "/ragflow/api/settings.py", line 172, in init_settings
2025-02-17T12:31:50.808794324Z     raise Exception(f"Not supported doc engine: {DOC_ENGINE}")
2025-02-17T12:31:50.808796463Z Exception: Not supported doc engine: elasticRagsearch
2025-02-17T12:31:54.526635262Z 2025-02-17 20:31:54,524 INFO 6258 init database on cluster mode successfully
```

How can I solve it? Thanks!

Gloryone-spiderman avatar Feb 17 '25 13:02 Gloryone-spiderman

In docker/.env, what's the value of DOC_ENGINE? Keep it consistent with the code repo.

KevinHuSh avatar Feb 18 '25 02:02 KevinHuSh
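For reference, the check that raises this exception lives in `api/settings.py` (line 172 in the traceback above). A rough sketch of its behavior, rewritten as shell for illustration — not RAGFlow's actual code, and the exact set of supported engine names may differ by version:

```shell
# Sketch of the DOC_ENGINE validation: only known engine names are
# accepted; anything else (e.g. "elasticRagsearch") aborts startup.
check_doc_engine() {
  case "$1" in
    elasticsearch) echo "doc engine: elasticsearch" ;;
    infinity)      echo "doc engine: infinity" ;;
    *) echo "Not supported doc engine: $1" >&2; return 1 ;;
  esac
}

check_doc_engine elasticRagsearch || echo "startup aborts here"
```

So the fix is purely configuration: make the DOC_ENGINE value in docker/.env match one of the names the code accepts.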

> In docker/.env, what's the value of DOC_ENGINE? Keep it consistent with the code repo.

Thanks! This is my DOC_ENGINE setting; I will try to compare it against the source code and make modifications:

```
# - infinity (https://github.com/infiniflow/infinity)
DOC_ENGINE=${DOC_ENGINE:-elasticRagsearch}
# DOC_ENGINE=infinity
```

Gloryone-spiderman avatar Feb 20 '25 04:02 Gloryone-spiderman
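One subtlety in the `.env` line above: the `${VAR:-default}` form means the value written in the file is only a fallback, used when `DOC_ENGINE` is not already set in the environment. An exported shell variable can therefore silently override what the file says. A quick demonstration:

```shell
# ${VAR:-default} expands to $VAR if it is set and non-empty,
# otherwise to the default written after ":-".
unset DOC_ENGINE
echo "${DOC_ENGINE:-elasticsearch}"    # falls back to the default

export DOC_ENGINE=infinity
echo "${DOC_ENGINE:-elasticsearch}"    # the exported value wins
```

So when fixing the default in docker/.env, it is worth checking that no stale `DOC_ENGINE` export is shadowing it.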

I pulled RAGFlow again and it now runs normally, but I encountered an issue when adding Ollama.

First, in the "Add model" dialog I set:

- Model type: Chat
- Model name: deepseek-r1:8b (obtained from `ollama list`)
- Base URL: http://{localhost}:11434 (the address I got from `ipconfig` on Windows)
- API key: none
- Max tokens: 1, just as a test
- Does it support Vision?: No

Second, following the "Ensure Ollama is accessible" step of the Deploy a local LLM guide, I ran `curl http://${IP_OF_OLLAMA_MACHINE}:11434/` once and got back: `Ollama is running`.

But my RAGFlow runs in Docker and my Ollama runs on the same machine, so I also tested `curl http://host.docker.internal:11434/` and got back:

`curl: (28) Failed to connect to host.docker.internal port 11434 after 21042 ms: Could not connect to server`

In short, I am unable to add the deepseek model from my local Ollama to the RAGFlow running in Docker.

Gloryone-spiderman avatar Feb 20 '25 08:02 Gloryone-spiderman


Added: Now I tried from inside the container:

`sudo docker exec -it ragflow-server bash`
`root@8136b8c3e914:/ragflow# curl http://host.docker.internal:11434/`

Got back: `Ollama is running`. It's OK! But I still cannot add Ollama models in RAGFlow.

Gloryone-spiderman avatar Feb 20 '25 08:02 Gloryone-spiderman

Try 172.17.0.1:11434.

KevinHuSh avatar Feb 21 '25 06:02 KevinHuSh
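Some context on why 172.17.0.1 can work where `host.docker.internal` fails: on plain Linux, `host.docker.internal` is not defined inside containers unless it is mapped explicitly (e.g. via `extra_hosts` with `host-gateway` in the compose file), while 172.17.0.1 is typically the `docker0` bridge gateway that reaches the host. A small helper sketching the choice (the IPs and port are the common defaults; verify them on your machine, e.g. with `docker network inspect bridge`):

```shell
# Pick a candidate Ollama base URL depending on the environment.
# "desktop" = Docker Desktop (Mac/Windows), "linux" = plain Linux host.
# Port 11434 is Ollama's default; adjust if you changed it.
ollama_base_url() {
  case "$1" in
    desktop) echo "http://host.docker.internal:11434" ;;
    linux)   echo "http://172.17.0.1:11434" ;;
    *)       echo "http://localhost:11434" ;;
  esac
}

# From inside the RAGFlow container, a reachability test would be:
#   curl "$(ollama_base_url linux)/"
# It should reply "Ollama is running" -- but only if Ollama listens on
# all interfaces (e.g. started with OLLAMA_HOST=0.0.0.0), not just
# 127.0.0.1, otherwise the container cannot reach it even with the
# right gateway IP.
```

Note that `localhost` inside a container refers to the container itself, which is why the original `http://{localhost}:11434` base URL cannot work from Docker.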