Matt Williams
Thanks so much for this issue. We are adding a community page soon to the docs and this might be a better place to add this.
The memory will be released about 5 minutes after the last time you use it.
It's automatic at this time. But we are looking into other options.
That’s great to hear. There is an interesting PR using environment variables that may solve this for some folks.
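(For anyone finding this later: a minimal sketch of what that environment-variable approach could look like. The variable name `OLLAMA_KEEP_ALIVE` and its accepted values are assumptions here; check the merged PR and the docs for the final spelling and behavior.)

```shell
# Hypothetical: override the default ~5-minute unload timer for loaded models.
# OLLAMA_KEEP_ALIVE is an assumed variable name; verify against the PR/docs.
export OLLAMA_KEEP_ALIVE=30m    # keep the model in memory for 30 minutes after last use

# or, to keep the model loaded indefinitely:
# export OLLAMA_KEEP_ALIVE=-1
```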
How much RAM does your machine have? You mentioned VRAM.
What OS are you running? How did you install it?
Thanks for sharing this. We are looking into it. There is a release coming soon which is 0.1.14, but I don't think that will be in there. Will let you...
Can you try repulling the models being used? We updated most of them in the last few weeks to address issues like this.
How did you install ollama?
hi @Pulkit077 I don't think Langchain has any restrictions on which models can be used with Ollama. So yes, you should be good there