Michael Yang
Is it a cloud platform, such as AWS, GCP, or self-hosted? If self-hosted, what hypervisor? What kind of disk is attached to the VM? Please provide as much detail as...
@UeberTimei The error on port 53 indicates a problem with DNS. Make sure DNS is set up correctly in this environment
> curl: (6) Could not resolve host: registry.ollama.ai

This indicates there's a problem with DNS. Please ensure DNS is set up correctly for the WSL context
#2719 is unrelated. Furthermore, this issue has gotten off topic. The original issue deals with fetching the install script, which has been resolved. I'm now going to close this issue....
Commenting here to say we're aware of MLX. I've been working on a prototype, but I can't give an ETA for MLX support at this time
@robertsmaoui I'm not sure what issues you're experiencing. The commands you provided should work as you'd expect.

```
$ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
a28f0d7934d3c96066a70937fc1b99d280b37653b423d6e45e31f82ce0951087...
```
As others have mentioned, ollama serves on localhost by default. If you want to change this, set `OLLAMA_HOST`. Please see the [FAQ](https://github.com/jmorganca/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network) for details
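For anyone landing here, the override described above can be sketched in one line. The variable name comes from the linked FAQ; the bind address and port below are example values, not a recommendation:

```shell
# Bind ollama to all interfaces instead of the default loopback.
# 0.0.0.0 exposes the server to the whole network; restrict as appropriate.
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```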
There's no specific problem or ask here so I'm going to close this issue
> We should update the existing routes that do have model and name to act similarly

This PR already does that unless I'm missing something:

```go
var model string
if...
```
Maybe worth noting this has always been the case for GGUF models