Comments of jasonliu (1)

> > Hello, following the tutorial I deployed chatollama on a server with Docker. The ollama service is also installed on the server, a few models are downloaded, and it runs normally: `tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN 2216/ollama`. But after setting the Ollama server to http://host.docker.internal:11434 as the tutorial describes, refreshing the model list shows none of the downloaded models, and the backend reports an error: `[nuxt] [request error] [unhandled] [500] fetch failed` `chatollama-1 | at Object.fetch (node:internal/deps/undici/undici:11576:11)` `chatollama-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)`. Please advise...
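The `netstat` line in the comment shows ollama bound to `127.0.0.1` only, so requests arriving from the Docker container via `host.docker.internal` cannot reach it, which would explain the `fetch failed` error. A possible fix, assuming ollama runs under systemd (the default for the Linux installer), is to bind it to all interfaces with `OLLAMA_HOST`:

```shell
# Bind ollama to all interfaces instead of loopback only.
# Assumption: ollama is managed by systemd, as installed by the official script.
sudo systemctl edit ollama.service
# In the override that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Verify the new bind address: it should now show 0.0.0.0:11434 (or :::11434)
# rather than 127.0.0.1:11434.
ss -tlnp | grep 11434
```

After restarting, refreshing the model list in chatollama with the server still set to `http://host.docker.internal:11434` should succeed, since the container can now reach the host-side port.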