Jim Scardelis

16 comments by Jim Scardelis

The API doesn't take *modelfiles* -- it uses *models*. Before you can use a model with the API, you first need to either create the actual model, e.g., `ollama create...
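For reference, a minimal sketch of that flow, assuming a Modelfile in the current directory and the server on its default port; the model name is just a placeholder:

```sh
# Build (or pull) a model first; the API only accepts model names.
ollama create mymodel -f ./Modelfile   # "mymodel" and ./Modelfile are placeholders
# ollama pull llama2                   # ...or pull a published model instead

# Then reference that model by name in API calls.
curl http://localhost:11434/api/generate \
  -d '{"model": "mymodel", "prompt": "Why is the sky blue?"}'
```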

To provide a data point: it seems to work fine on my Intel MacBook (32 GB, i9).

Hi, it would be better to ask questions like this in the [Discord](https://discord.gg/bduDybW3). It looks like your `docker run` command is constructed incorrectly. Did it not throw an error? The...
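For comparison, the invocation documented in the Ollama README (CPU-only case) looks roughly like this; the volume and container name are just the documented defaults:

```sh
# Run the Ollama server in a container, persisting models in a named volume
# and exposing the default API port.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```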

If Ollama is intended to be used as a *local* LLM system, then queuing requests and processing them serially is appropriate. Enabling concurrent processing, e.g., for a "server" scenario, would significantly...
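As a rough illustration of what serial queuing means in practice: if two clients submit requests at the same time and requests really are processed serially, the second generation only starts once the first finishes (model name and port below are defaults, purely for the sketch):

```sh
# Fire two generate requests concurrently; with serial processing the second
# one sits in the queue until the first completes.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama2", "prompt": "Why is the sky blue?"}' > first.json &
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama2", "prompt": "Why is grass green?"}' > second.json &
wait
```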

What is the use case for this? Is it causing a problem? Preventing offloading doesn't seem like an optimal solution to me, as it could easily cause resource starvation...

FWIW, I wouldn't expect a tool like this to be self-updating. I'd want that to be managed by a package manager, such as Homebrew on the Mac (see issue #3...