opencode
[FEATURE]:allow setting the context size for local models
- [x] I have verified this feature I'm about to request hasn't been suggested before.
Describe the enhancement you want to request
An example of getting a response while setting the context length:

```sh
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "options": { "num_ctx": 4096 }
}'
```
It would be good to be able to specify the num_ctx to send to the model in the parameter file, or to allow an override of the context size. If the provider is Ollama, the num_ctx option would then be sent with the request.
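A minimal sketch of what this could look like: a configured context size is forwarded as Ollama's num_ctx request option, and omitted entirely when nothing is configured. The config key name (`context_size`) and the helper itself are hypothetical, not part of opencode today.

```python
import json

def build_ollama_payload(model, prompt, provider_options=None):
    """Build the body for Ollama's POST /api/generate, forwarding a
    configured context size as the num_ctx option when one is set.
    The "context_size" key is a hypothetical config field."""
    payload = {"model": model, "prompt": prompt}
    ctx = (provider_options or {}).get("context_size")
    if ctx is not None:
        # Ollama reads the context window from options.num_ctx
        payload["options"] = {"num_ctx": int(ctx)}
    return payload

# Example: a configured override of 4096 becomes options.num_ctx
body = build_ollama_payload("llama3.2", "Why is the sky blue?",
                            {"context_size": 4096})
print(json.dumps(body))
```

With no override configured, the options block is left out and Ollama falls back to the model's default context window.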