blakkd

Results 39 comments of blakkd

Hi! I also have the same question: ``` ~/reader/backend/functions ❯❯❯ npm start > start > npm run shell > shell > npm run build && firebase functions:shell > build > node...

> ### Describe your problem > Hi Team, > > We're working on integrating the Ollama model deployed locally. While querying the local Ollama gives us the correct responses, we're...

or set LLM_OLLAMA_BASE_URL to http://127.0.0.1:11434
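In case it helps, here is a minimal sketch of what I mean, assuming the app reads the LLM_OLLAMA_BASE_URL environment variable (11434 is Ollama's default port; adapt to your setup):

```shell
# Point the app at a locally running Ollama server.
# LLM_OLLAMA_BASE_URL is the variable discussed above; the value is
# Ollama's default listen address.
export LLM_OLLAMA_BASE_URL="http://127.0.0.1:11434"

# Quick sanity check that the variable is set as expected.
echo "$LLM_OLLAMA_BASE_URL"
```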

I personally went mad for almost an hour before trying this, and inference seems to use this variable too

I don't know at all if it's relevant for you as I don't know your use case. But for anyone in my case: I wanted to prevent the embedding model...

Ok, deleting my comment as it's not relevant: I see Ollama doesn't actually load the embedding models into the GPU, even if you set num_gpu to 10 or so. So...

Oh I see. But then why do you want to set keep_alive? It's meant to do exactly the opposite :thinking: However, maybe setting a duration value < 5 min (from the last inference)...
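For reference, keep_alive can be passed per request to Ollama's /api/generate endpoint; it controls how long the model stays loaded after the last request (the default is 5m, and -1 keeps it loaded indefinitely). A sketch of such a request body (model name and prompt are just placeholders):

```shell
# Build an example /api/generate request body that shortens keep_alive
# to 2 minutes, so the model unloads sooner than the 5m default.
cat > /tmp/ollama_request.json <<'EOF'
{
  "model": "llama3",
  "prompt": "hello",
  "keep_alive": "2m"
}
EOF

# To actually send it (assuming a local server on the default port):
# curl http://127.0.0.1:11434/api/generate -d @/tmp/ollama_request.json
cat /tmp/ollama_request.json
```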

(Just to inform: it seems the fact that ollama wasn't able to load the embedding models on the GPU was a bug. I don't face it anymore on 0.3.8. I think...
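For anyone who still wants to control GPU offload explicitly, the num_gpu option I mentioned can be set in a Modelfile; this is only a sketch, and "nomic-embed-text" is just an example model name:

```shell
# Create a Modelfile variant that requests 10 layers offloaded to the GPU.
# num_gpu is the number of model layers to place on the GPU.
cat > /tmp/Modelfile <<'EOF'
FROM nomic-embed-text
PARAMETER num_gpu 10
EOF

# Then build the variant with: ollama create my-embed -f /tmp/Modelfile
cat /tmp/Modelfile
```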

I guess you judged this info valuable enough to take action and share it, and this is always great! But here is my view: even if it came from a...

Wait, are you kidding? I looked a bit at your CV, your "website", your video footage filming a monitor from a tripod (I don't know what you're smoking) where we...