Michael Yang

116 comments by Michael Yang

Can you describe your build environment, specifically the distro and its version, so we can try to reproduce? I'm curious. Is there a specific reason you're interested in 0.1.6? There's been...

On initial investigation, this appears to be a bug, though it's unclear whether it's in Ollama or llama.cpp. I can reproduce this on Linux:

```
$ curl -X...
```

What are the permissions and ownership of `OLLAMA_MODELS`? The ollama process runs as the `ollama` user and group (`ollama/ollama`). If the models path does not allow `rwx` for `ollama`, the process will fail to...
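As a sketch of that check (the directory used here is a made-up stand-in, not a real install path; a real system would inspect the actual `OLLAMA_MODELS` directory and may need `sudo chown -R ollama:ollama` to fix ownership):

```shell
# Stand-in for the OLLAMA_MODELS directory (hypothetical path for illustration)
MODELS_DIR=/tmp/ollama-models-demo
mkdir -p "$MODELS_DIR"

# Ensure the owning user has read/write/execute on the path
chmod u+rwx "$MODELS_DIR"

# Verify rwx access the way the ollama process would need it
test -r "$MODELS_DIR" && test -w "$MODELS_DIR" && test -x "$MODELS_DIR" \
  && echo "models path accessible"
```

On a real system, `ls -ld "$OLLAMA_MODELS"` shows the current ownership, and `chown -R ollama:ollama` grants it to the service account.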

Can you share the model? Without knowing the specifics, there's not much we can do to help troubleshoot.

The progress bar has been updated to display decimals when necessary.

I haven't been able to reproduce this; Ollama behaves as expected. Here is the test I ran:

1. Create a VM with multiple, small physical volumes
2. Create a logical...

It seems likely that the CLI also requires the proxy settings, not just the server. That would explain why the API (presumably called with curl) works while the CLI does not. This...
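A minimal sketch of the distinction (the proxy address is a made-up placeholder): the server process gets its proxy variables from its own environment, such as a systemd unit, while the CLI reads them from the interactive shell, so both need to be configured.

```shell
# Hypothetical proxy address for illustration
export HTTPS_PROXY=http://proxy.example.com:3128

# The CLI inherits this variable from the shell. The server does NOT --
# it must be set in the server's own environment, e.g. the systemd unit:
#   [Service]
#   Environment="HTTPS_PROXY=http://proxy.example.com:3128"

echo "CLI proxy: $HTTPS_PROXY"
```

If only the unit file carries the variable, API calls made through the server work while CLI-side network operations bypass the proxy, matching the symptom described above.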

Please see #2168, which describes a possible failure scenario when using `HTTP_PROXY`. If that doesn't represent your problem, please describe your issue in detail and we can determine if there's...

Without `OLLAMA_HOST` set, ollama defaults to `127.0.0.1:11434`, so unless it's set to `0.0.0.0:63321` for both the server and the client, it will use `127.0.0.1:11434` regardless of the Python environment.
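The resolution rule can be sketched in shell (a stand-in for the client's actual logic, which is an assumption here):

```shell
# Return the address the ollama client will use: OLLAMA_HOST if set,
# otherwise the built-in default of 127.0.0.1:11434
resolve_host() {
  echo "${OLLAMA_HOST:-127.0.0.1:11434}"
}

unset OLLAMA_HOST
resolve_host        # prints 127.0.0.1:11434

export OLLAMA_HOST=0.0.0.0:63321
resolve_host        # prints 0.0.0.0:63321
```

Because the server and the CLI each resolve the address independently, exporting `OLLAMA_HOST` in only one place (for example, inside a Python virtualenv activation script) leaves the other side on the default.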