Alias4D
Error connecting to the local Ollama server. Any fix, please?
Problem solved for NextChat. Just set these variables: OLLAMA_HOST to 0.0.0.0 and OLLAMA_ORIGINS to *, set the OpenAI endpoint to 127.0.0.1:11434, and use the same model name shown by ollama list.
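For reference, a minimal sketch of that setup from the client side, assuming the openai Python package is installed and that "llama2" stands in for whatever name ollama list actually shows:

# Start the server with the host/origins variables set, e.g.:
#   OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
from openai import OpenAI

# Any OpenAI-compatible client can talk to Ollama's /v1 endpoint;
# the API key is required by the client library but ignored by Ollama.
client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="ollama")

resp = client.chat.completions.create(
    model="llama2",  # must match a name from `ollama list`
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)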
Thanks for the reply.
I mean serving Ollama on Android through an app or Termux, not a client application.
Yes, I need to expose the Ollama OpenAI-compatible API over the LAN to use it with other OpenAI API clients on Windows or Android, like PandasAI, Maid, NextChat web...
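For context, a rough sketch of what a client on another machine could look like once OLLAMA_HOST=0.0.0.0 is set on the device running Ollama; 192.168.1.50 is only a placeholder for that device's LAN address:

from openai import OpenAI

# Replace 192.168.1.50 with the LAN IP of the device running Ollama.
client = OpenAI(base_url="http://192.168.1.50:11434/v1", api_key="ollama")

# Stream the reply so the client sees tokens as they arrive over the network.
for chunk in client.chat.completions.create(
    model="llama2",  # placeholder; use a name from `ollama list`
    messages=[{"role": "user", "content": "Say hi over the LAN"}],
    stream=True,
):
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)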
Thanks for the reply 🙏🌹 Problem solved 😊
Thanks for your time 🙏 I tried the steps above, but I get a Nitro.exe Application Error: "The application was unable to start correctly (0xc0000142). Click OK to close the application."
{ "notify": true, "run_mode": "cpu", "nvidia_driver": { "exist": false, "version": "" }, "cuda": { "exist": false, "version": "" }, "gpus": [], "gpu_highest_vram": "" }
I tried reinstalling Ollama but get the same error message 😌 App.log below 👇
time=2024-02-25T17:50:52.780+03:00 level=WARN source=server.go:113 msg="server crash 1 - exit code 2 - respawning"
time=2024-02-25T17:50:53.293+03:00 level=ERROR source=server.go:116 msg="failed to restart...
Thanks, the problem is fixed.