Dhiren Mathur
Hey @dfordp, thanks for attempting this! In the interest of time, I'm assigning this to @PRANJALRANA11 since there have been no follow-up messages in ~10 days. We will...
Hi! We will be taking this issue up internally in the interest of time. Thanks for the offers, folks; we are opening new issues soon that you will be able...
Hey @WTK, the parsing happens on the Celery worker. Can you please confirm that the Celery worker is up? You should see a log like `celery@<hostname> ready.`
That is weird @WTK, that agent usually works well without node ids passed to it.

> My guess is that this node_ids is holding some pieces of "knowledge graph" but they're...
Hey @WTK, weird, I'm getting an empty list at that point. But you're right, we should at least have a null check there; we'd appreciate a PR if you could raise...
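For anyone picking this up, the guard could be as simple as this minimal sketch (function and names here are illustrative, not potpie's actual code):

```python
def safe_node_ids(node_ids):
    # Hypothetical helper: callers can go through this instead of
    # dereferencing node_ids directly, so a None or empty value
    # degrades to an empty list instead of raising.
    if not node_ids:  # handles both None and []
        return []
    return list(node_ids)
```

The agent would then iterate over `safe_node_ids(node_ids)` and simply do no graph lookups when nothing was passed.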
Hi @weekendli! Thanks for using potpie! You can use any provider supported by litellm by setting the following variables in your env file:

```
LLM_PROVIDER=provider_name
LLM_API_KEY=provider_key
LOW_REASONING_MODEL=provider/model_name
HIGH_REASONING_MODEL=provider/model_name...
```
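As a concrete (illustrative) example, litellm identifies models as `provider/model` strings, so an OpenAI setup might look like this; the model names below are just examples, use whatever your provider supports:

```
LLM_PROVIDER=openai
LLM_API_KEY=sk-...
LOW_REASONING_MODEL=openai/gpt-4o-mini
HIGH_REASONING_MODEL=openai/gpt-4o
```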
@weekendli, may I close this if there are no further questions?
@weekendli, we recently added support for the LLM_API_BASE env variable as part of supporting Azure OpenAI, but there is a simple check for Azure in provider_service that you would have...
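For context, a provider check like that might look something like this minimal sketch (the function name and the Azure path suffix are assumptions for illustration, not potpie's actual implementation):

```python
import os

def resolve_api_base(provider: str):
    """Hypothetical sketch: pass a custom LLM_API_BASE through to the
    LLM client, special-casing Azure-style endpoints."""
    api_base = os.environ.get("LLM_API_BASE")
    if not api_base:
        return None
    # Azure OpenAI endpoints use a different URL layout than most
    # providers, so they are routed separately here.
    if provider == "azure":
        return api_base.rstrip("/") + "/openai/deployments"
    return api_base
```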
Hi @jai2033shankar, thanks for trying out potpie! Can you please verify that your version of Python is 3.10, as that is a requirement for blar?
Please try with Python 3.10; I just tried `pip install --no-cache-dir blar-graph==1.1.6` and pip was able to resolve the version on Python 3.10.