Kim Hallberg
@AprilCat this looks like a connection issue; it might be the same as https://github.com/ollama/ollama/issues/4042 or another networking problem. Check the [issues labelled networking](https://github.com/ollama/ollama/issues?q=is%3Aopen+is%3Aissue+label%3Anetworking) or create your own bug report.
Afaik the [Qwen 1.5](https://ollama.com/library/qwen) versions on Ollama are Qwen 2: the `family` field in the `model` layer reads `qwen2`. It might be that Ollama doesn't support the AWQ versions....
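One way to check this yourself is to read `details.family` from Ollama's `/api/show` endpoint. A minimal sketch, assuming a local Ollama server and the documented response shape (the model name and field values here are illustrative, not taken from the comment above):

```python
import json

# Abridged sample of the JSON returned by Ollama's /api/show endpoint;
# in practice you'd fetch it with something like:
#   curl http://localhost:11434/api/show -d '{"name": "qwen"}'
sample_response = '''
{
  "details": {
    "format": "gguf",
    "family": "qwen2",
    "parameter_size": "4B",
    "quantization_level": "Q4_0"
  }
}
'''

# Pull out the reported model family from the details object.
details = json.loads(sample_response)["details"]
print(details["family"])
```

If the server reports `qwen2` here, the weights are the Qwen 2 lineage regardless of the library page's name.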
Python 3.12 isn't working just yet with llama.cpp's Python tools; Python 3.11 works fine though. 👍
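If you want to fail fast instead of hitting a cryptic dependency error, a small version guard at the top of a conversion script can help. A sketch only; the function name and cutoff are my own, reflecting the state described above:

```python
import sys

def supports_llamacpp_python_tools(version_info=None) -> bool:
    """Return True if this interpreter version is known to work with
    llama.cpp's Python tooling (3.12 does not, as of this comment)."""
    if version_info is None:
        version_info = sys.version_info
    major_minor = tuple(version_info[:2])
    return (3, 0) <= major_minor < (3, 12)

# Example: warn before importing the tooling's dependencies.
if not supports_llamacpp_python_tools():
    print("Warning: use Python 3.11 or earlier for llama.cpp's Python tools.")
```

Dropping the same check into a `requires-python = ">=3.8,<3.12"` constraint in your project metadata achieves the same thing declaratively.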
> the model jina/jina-embeddings-v2-base-de is now [downloadable](https://ollama.com/jina/jina-embeddings-v2-base-de) from the ollama model list. It uses a jina-bert-v2 architecture. Unfortunately, this architecture is not supported in ollama v.0.1.38: The PR to update...
Yes, you can add your own models or open-source models, as long as the architecture is supported by Ollama and llama.cpp and the model creator allows it. Ollama has a...
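The usual route for a local model is a Modelfile pointing at a GGUF file. A minimal sketch (the file name and parameter are placeholders, not from the original comment):

```
# Hypothetical Modelfile importing a local GGUF export
FROM ./my-model.Q4_K_M.gguf
PARAMETER temperature 0.7
```

You'd then register it with `ollama create my-model -f Modelfile` and run it with `ollama run my-model`.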
@jmorganca superseded this PR with https://github.com/ollama/ollama/pull/4414.
The model is already available on Ollama: https://ollama.com/bramvanroy/geitje-7b-ultra
Yes, Bloom's `BloomForCausalLM` architecture is supported by llama.cpp. There are already two 1B-class Bloom models on Ollama, added by a community member:

- https://ollama.com/gouranshitera/bloom-1b1
- https://ollama.com/gouranshitera/bloomz-1b7

You can...
Ollama doesn't currently support Jina Embeddings v2; it should be supported once https://github.com/ollama/ollama/pull/4414 gets merged, so you'd likely have to wait for the next Ollama release or build from source...
This seems like an issue that should be created on https://github.com/ollama/ollama-js instead.