frob


https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size

What exactly doesn't work?

I agree, I have a WIP to update various bits of documentation but haven't pushed a PR yet.

It should be noted that changing the context size of a model via the Modelfile is mentioned in the [OpenAI doc](https://github.com/ollama/ollama/blob/main/docs/openai.md#setting-the-context-size), not the FAQ.
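As a minimal sketch of that approach (the base model name here is just an example — use whichever model you actually run):

```
# Modelfile — derive a variant with a larger context window
FROM llama3
PARAMETER num_ctx 8192
```

Then build and run the variant with `ollama create llama3-8k -f Modelfile` followed by `ollama run llama3-8k`.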

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may help in debugging.

> but due to something ollama getting bug or memory alloc problem, idk

It's too bad there's not some record of what ollama is doing that could be looked at...

Most of them are llama.cpp [options](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md#common-options) which are passed through. Some have a different name but [also map](https://github.com/ollama/ollama/issues/5946#issuecomment-2250122405) on to llama.cpp options.
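For example, such an option can also be set per request through the API's `options` field — a sketch of a `/api/generate` request body, assuming a model named `llama3` is installed (here `num_ctx` corresponds to llama.cpp's `--ctx-size`):

```json
{
  "model": "llama3",
  "prompt": "Hello",
  "options": { "num_ctx": 8192 }
}
```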

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will aid in debugging.

```
Oct 02 11:40:28 grasshopper-tesla ollama[23633]: DEBUG [update_slots] slot context shift | n_cache_tokens=2048 n_ctx=8192 n_discard=1021 n_keep=5 n_left=2042 n_past=2047 n_system_tokens=0 slot_id=1 task_id=10 tid="140304693719040" timestamp=1727869228
```

This model is codellama:7b-code. It's stuck...