Farid Ullah Khan
Could someone share example code for local RAG with local Ollama LLM model(s), please? :)
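Not an official answer, but here is a minimal local-RAG sketch in plain Python. Assumptions: an Ollama server at `localhost:11434` serving a model such as `openhermes2.5-mistral` (the model name is a placeholder), and a toy bag-of-words retriever standing in for a real vector store so the example runs without extra dependencies. The `ask_ollama` call is a stub left commented out until the server is running.

```python
# Minimal local-RAG sketch. Assumptions: Ollama at localhost:11434 with a
# model like "openhermes2.5-mistral"; the retriever is a toy bag-of-words
# scorer, not a real embedding/vector-store setup.
import json
import urllib.request
from collections import Counter

DOCS = [
    "Ollama serves local LLMs over an HTTP API on port 11434.",
    "RAG retrieves relevant documents and feeds them to the model as context.",
    "Trip planning agents call external tools to search for flights.",
]

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of shared lowercase words."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(DOCS, key=lambda d: score(query, d), reverse=True)[:k]

def ask_ollama(prompt: str, model: str = "openhermes2.5-mistral") -> str:
    """Send the augmented prompt to a local Ollama server (stub; needs Ollama running)."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    question = "How does RAG use retrieved documents?"
    context = "\n".join(retrieve(question))
    print(f"Context: {context}")
    # Uncomment once Ollama is running locally:
    # print(ask_ollama(f"Context:\n{context}\n\nQuestion: {question}"))
```

Swapping the toy retriever for a real embedding store (e.g. a LangChain vector store) keeps the same shape: retrieve, build the prompt with the retrieved context, then generate.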
same error here
Is there possibly an existing "latest trends" agent on the LangChain website or in the templates? I think these are LangChain agents.
Same here. I am using the Ollama openhermes2.5-mistral model locally; it loads fine and works for a bit, then crashes with this error... openai.error.RateLimitError: You exceeded your current...
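A RateLimitError from the `openai` package while using Ollama suggests the requests are still going to api.openai.com rather than the local server. One workaround sketch (an assumption, not a confirmed fix; the exact variable names and the `/v1` endpoint depend on your library and Ollama versions) is to point the OpenAI-style client at Ollama's OpenAI-compatible endpoint via environment variables:

```shell
# Workaround sketch: redirect OpenAI-style API calls to the local Ollama
# server. Variable names and the /v1 endpoint path are assumptions that
# depend on the library and Ollama versions in use.
export OPENAI_API_BASE="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"   # placeholder; a local server typically ignores it
export OPENAI_MODEL_NAME="openhermes2.5-mistral"
echo "$OPENAI_API_BASE"
```

Set these in the shell before launching the script so the client library picks them up at startup.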
Can you first test your LiteLLM-with-ollama2-model setup to make sure it works by itself? Maybe it gives you a curl command to run to test the API...
i got same error as well now. on every run. I am using ollama locally. testing trip_planner scrupt. when I run the main.py after asking me initial questions it spits...
is there an example of using a local LLM in browser_tools.py file?
+1 for me as well please :)
here are the steps i followed before the step 11. I am on fresh Ubuntu 22.04 install. install Python 3.10.13: sudo apt update sudo apt install build-essential zlib1g-dev libncurses5-dev libgdbm-dev...