Nikos Tsarmpopoulos
I changed the contents of `PROMPTS["DEFAULT_ENTITY_TYPES"]` directly in the `prompts.py` file. This should do the job, but it did not work. I'll have to look deeper into the code...
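For reference, this is the kind of override I attempted, expressed at runtime instead of editing the installed file. It's only a sketch: the `lightrag.prompt` module path is an assumption based on my install, and the entity types below are purely illustrative.

```python
# Override the default entity types at runtime, rather than patching the package.
# Assumption: the PROMPTS dict is importable from lightrag.prompt in this version.
from lightrag.prompt import PROMPTS

PROMPTS["DEFAULT_ENTITY_TYPES"] = [
    "organization",
    "person",
    "product",      # illustrative, domain-specific types
    "regulation",
]
```

If overriding the dict doesn't take effect either, the extraction prompt is presumably being assembled from somewhere else (e.g. a cached or precomputed prompt), which is what I want to check in the code.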
To test my Azure account configuration, I deployed GraphRAG using the second method from [here](https://microsoft.github.io/graphrag/get_started/). I manually deployed the AI models in Azure OpenAI and configured the GraphRAG settings. All...
Update: The issue was caused by the following setting in deploy.sh: `GRAPHRAG_LLM_MODEL_QUOTA="80"`. The default quota for the LLM resources I had reserved on Azure was reportedly 8, so changing the...
> I noticed that you are using a local Ollama as your LLM. LightRAG's default context window size is 32K, which is much larger than Ollama's default of 2K. This...
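In case it helps anyone else hitting the same mismatch, here is a minimal sketch of raising the Ollama context window when wiring a local model into LightRAG. The module paths, the `llm_model_kwargs`/`options`/`num_ctx` parameters, and the model names are assumptions based on the version I have installed; they have shifted between releases.

```python
# Sketch: ask Ollama for a 32K context so it matches LightRAG's expectations.
# Assumptions: lightrag.llm.ollama exposes ollama_model_complete / ollama_embed,
# and the Ollama client forwards the "options" dict (including "num_ctx").
from lightrag import LightRAG
from lightrag.llm.ollama import ollama_model_complete, ollama_embed
from lightrag.utils import EmbeddingFunc

rag = LightRAG(
    working_dir="./rag_storage",
    llm_model_func=ollama_model_complete,
    llm_model_name="qwen2.5:7b",  # whichever local model you have pulled
    llm_model_kwargs={
        "host": "http://localhost:11434",   # default local Ollama endpoint
        "options": {"num_ctx": 32768},      # override Ollama's small default
    },
    embedding_func=EmbeddingFunc(
        embedding_dim=768,
        max_token_size=8192,
        func=lambda texts: ollama_embed(
            texts, embed_model="nomic-embed-text", host="http://localhost:11434"
        ),
    ),
)
```

Alternatively, the larger context can be baked into the Ollama model itself with a Modelfile containing `PARAMETER num_ctx 32768`, which avoids touching the LightRAG configuration.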
Thanks again! > 1. When multiple files are uploaded, once the first file upload is complete, the server immediately initiates a processing job, resulting in subsequent files being queued until...
In its own conda environment: `pip install chatterbox-tts` ... The output includes: "Installing collected packages: numpy" and then "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This..."
Hello, have you tested with a larger LLM model? Do you still get invalid source IDs? DRIFT search doesn't work for me on GraphRAG v2.3; I will retry with v2.1.