Daniel Hiltgen


Unicode characters in the model directory should be working now. Please give [0.1.33](https://github.com/ollama/ollama/releases) (available in pre-release) a try and let us know.

With PR #3418 I'm setting us up to be able to support mixed GPU types. I'm not sure if this will be possible in Docker or not, but on the...

As to the out-of-memory error, can you share the server log? We report some memory details before we load that may help us narrow down where the prediction logic is...
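For anyone hitting a similar out-of-memory error, a sketch of how to pull the server log (the systemd unit name and macOS log path below are the usual ollama defaults; adjust for your install):

```shell
# Linux (systemd install): dump the ollama service log to a file
journalctl -u ollama --no-pager > server.log

# macOS: the server writes its log under ~/.ollama/logs
cat ~/.ollama/logs/server.log
```

The memory breakdown we print just before loading the model is what helps narrow down the prediction logic, so include the lines from startup through the load attempt.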

Ah, I need to clarify something. Unfortunately, we won't be able to support splitting a single model across GPUs from different vendors with our current LLM runner architecture. You could...

Yes, those should all work on main now that #3418 has merged, as long as you "opt in" to the new concurrency support. (see the PR for instructions)
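As a quick sketch of the opt-in, the environment variables come from the PR discussion; the values here are illustrative, not recommendations:

```shell
# Opt in to the new concurrency support before starting the server.
export OLLAMA_NUM_PARALLEL=4        # parallel requests per loaded model
export OLLAMA_MAX_LOADED_MODELS=2   # models allowed to stay loaded at once
ollama serve
```

Leaving these unset keeps the previous one-request-at-a-time behavior.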

Looks like a dup of #3504

This sounds like it may be an NVIDIA driver bug, or possibly hardware fault. Did you get a BSOD? Did it report which driver was hung?

Hit a known compile bug upstream on ARM - backing off to a prior release...

Moved back to `b1842` and the build works. Tested on a handful of systems. (mac, linux, cuda/rocm)