stevenhe1988
Due to company-specific constraints on the Python version, we cannot use Python 3.10. With Python 3.11 + Ubuntu 22.04 + CUDA 12.1, parsing fails with `Segmentation fault (core dumped)`. What is the constraint that requires Python 3.10 rather than 3.11? Are there plans to support 3.11, or is there a way to configure things so it runs on 3.11? Thanks.
### Describe your problem Platform: Windows 11 + RAGFlow 0.17.0 Full. Problem: when running inference from the command line with Ollama, the model is loaded onto the GPU, but when RAGFlow calls the Ollama model to parse documents, the model is loaded into CPU + system memory. The OCR/document-parsing model is minicpm-v:latest (model file size 5.5 GB). Because the model runs on CPU + memory instead of the GPU, parsing a somewhat large file (e.g. a 30+ MB PDF) runs out of memory and entrypoint.sh gets killed. Moreover, WSL2 on Windows 11 can currently only be set to at most 50% of system memory, so the memory available to Docker cannot be increased. ------------------------------------------ [Update] When extracting a knowledge graph with an Ollama model, the model is loaded onto the GPU; the model used there is deepseek-r1:7b-qwen-distill-q4_K_M (model file size 4.7 GB).
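On the WSL2 memory point: 50% of host RAM is the *default* cap on recent Windows builds, and a `.wslconfig` file on the Windows host can usually raise it. A minimal sketch (the sizes below are examples; whether a value above 50% is honored depends on your Windows build):

```ini
; %UserProfile%\.wslconfig on the Windows host (not inside the distro)
[wsl2]
memory=24GB      ; raise the WSL2 RAM cap above the default 50% of host RAM
swap=8GB         ; extra headroom for large PDF parses
processors=8
```

After editing, run `wsl --shutdown` from PowerShell and restart Docker Desktop so the new limits take effect.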
### Describe your problem I'm using RAGFlow 0.17 on Windows, and I've been trying to add Azure-OpenAI as the default LLM. According to the documentation (https://ragflow.io/docs/dev/configurations), it supports 6 factory types, but not...
### Self Checks - [x] I have searched for [existing issues](https://github.com/infiniflow/ragflow/issues), including closed ones. - [x] I confirm that I am using English to submit this...
Hello! Thanks for building zerox; it's quite easy to use. I am currently using a customized prompt when calling the GPT-4o API. Is there any place I can set seed/TOP...
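A hedged sketch of one possible approach, assuming py-zerox forwards extra keyword arguments to the underlying completion call (its documented `**kwargs` parameter suggests this, but verify against your installed version). The file path, model name, and prompt below are placeholders, not zerox defaults:

```python
# OpenAI-style sampling controls. "seed" is OpenAI's best-effort
# reproducibility parameter; "top_p" is nucleus sampling.
completion_kwargs = {
    "temperature": 0.0,
    "top_p": 0.9,
    "seed": 42,
}

async def run():
    from pyzerox import zerox  # pip install py-zerox
    return await zerox(
        file_path="document.pdf",                            # placeholder
        model="gpt-4o",
        custom_system_prompt="Extract tables as Markdown.",  # placeholder
        **completion_kwargs,  # assumption: passed through to the LLM call
    )

# import asyncio; asyncio.run(run())  # needs OPENAI_API_KEY and a real PDF
```

Note that even with a fixed seed, OpenAI only promises best-effort determinism (check the `system_fingerprint` in the response to detect backend changes).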