Kim Hallberg

Results: 91 comments of Kim Hallberg

> Is possible to use this Bert model in ollama?
>
> jinaai/jina-embeddings-v2-base-es

The Jina team has added their models to the registry, you can find them on their user...

> this model can now easily featured in ollama

Ollama needs to update its version of llama.cpp first, maybe https://github.com/ollama/ollama/pull/5475 could be updated to include the OpenELM PR, @jmorganca?

An awesome idea, I might steal it for my Nuxt site. 😄 👍

@TawsifKarim I know this is really late, but the issue is your `ListenAndServe` call: you shouldn't bind to localhost. That is the loopback interface and is not exposed externally. You can...

You need to pass in an object to `ollama.pull`, like you do for most functions. You can see that in the documentation for `pull` in the README - https://github.com/ollama/ollama-js?tab=readme-ov-file#pull ```js...

`ollama` is not framework-dependent; it's a standalone package and can be used with your framework of choice.

> Thank you for pointing out my spelling mistake

If your issue is resolved, please consider closing it.

PHP runtimes error out with: `The job exceeded the maximum log length, and has been terminated.` The build failure was not related to this PR.

> Thanks Rick

If your issue is resolved, please consider closing it.

Ollama doesn't provide tools, only tool-calling support. As the examples show, it's up to you to build the tools, and the implementations of those tools, that the LLM will ultimately use.