Kim Hallberg
> Is it possible to use this BERT model in Ollama? > > jinaai/jina-embeddings-v2-base-es The Jina team has added their models to the registry; you can find them on their user...
> this model can now easily be featured in Ollama Ollama needs to update its version of llama.cpp first; maybe https://github.com/ollama/ollama/pull/5475 could be updated to include the OpenELM PR, @jmorganca?
An awesome idea, I might steal it for my Nuxt site. 😄 👍
@TawsifKarim I know this is really late, but the issue is your `ListenAndServe` call: you shouldn't specify localhost, since that binds to the loopback interface and is not exposed externally. You can...
You need to pass an object to `ollama.pull`, like you do for most functions. You can see that in the README documentation for `pull` - https://github.com/ollama/ollama-js?tab=readme-ov-file#pull ```js...
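A quick sketch of the difference, without a running Ollama server: the model name `llama3.1` and the `validatePullRequest` helper below are just illustrations of the request-object shape, not part of the library:

```javascript
// ollama-js expects a request object, e.g. ollama.pull({ model: 'llama3.1' }),
// not a bare string like ollama.pull('llama3.1').
const request = { model: 'llama3.1' };

// Hypothetical helper that mimics the shape check, so the example
// runs without a live server.
function validatePullRequest(req) {
  if (typeof req !== 'object' || req === null || typeof req.model !== 'string') {
    throw new TypeError('pull expects an object like { model: "..." }');
  }
  return req;
}

console.log(validatePullRequest(request).model); // llama3.1
```

With a real client the call would then be `await ollama.pull(request)`.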
`ollama` is not framework-dependent; it's a standalone package that can be used with your framework of choice.