Windows build is broken.
go build .
# github.com/jmorganca/ollama/llm
llm\llm.go:83:17: undefined: gpu.GetGPUInfo
llm\llm.go:89:9: undefined: nativeInit
llm\llm.go:92:109: undefined: extServer
llm\llm.go:94:15: undefined: newDynamicShimExtServer
llm\llm.go:101:9: undefined: newDefaultExtServer
llm\llama.go:211:24: undefined: libEmbed
llm\llama.go:218:19: undefined: libEmbed
I forgot to update the docs (I'll post a PR later today). Make sure to enable CGO. Assuming PowerShell: $env:CGO_ENABLED="1"
You'll also need to install MinGW so the C/C++ code can be compiled with the GCC toolchain.
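Putting the steps above together, a minimal sketch of the build in PowerShell, assuming Go and a MinGW-w64 GCC toolchain are already installed and on PATH:

```shell
# Enable CGO so go build compiles the bundled C/C++ code
# (by default CGO may be disabled on Windows)
$env:CGO_ENABLED = "1"

# Verify gcc is reachable; go build will fail without it
gcc --version

# Build from the repository root
go build .
```

Note that `$env:CGO_ENABLED` only affects the current PowerShell session; set it again in new terminals, or configure it persistently via `go env -w CGO_ENABLED=1`.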
Note: things are still a bit rough around the edges for native Windows. I'm working on some improvements to the Windows build, so we should end up with a single binary that runs without any special PATH setup and operates natively on either the CPU or a CUDA card.
Thanks Daniel. Will rebuild with your instructions.
Keep an eye on https://github.com/jmorganca/ollama/pull/1680 which is where I'm planning to post the windows native build improvements once I get them sorted out.
@dhiltgen Great that #1680 is merged. Does that mean I can also close this ticket?