continue
Forward all completion options to the LLM
This behaviour allows optional parameters beyond those Continue configures to be forwarded to the LLM, enabling some server-side logging, for example by filling in the `user` field of the OpenAI API.
You may want to merge this, or look at how this works for other model providers as well. I was tinkering and had it working, so I thought I might as well open a PR.
@payoto we recently merged the ability to set `requestOptions.body`, which is another way to pass through arbitrary params. I think that would solve your problem? It also has the benefit that we don't need to separately make sure the pass-through is handled for all different providers. Let me know what you think.
(and here's an example):
{
  "models": [
    {
      "title": "OpenAI",
      "provider": "openai",
      "model": "MODEL_NAME",
      "apiKey": "EMPTY",
      "requestOptions": {
        "body": { "user": "..." }
      }
    }
  ]
}
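The intended merge semantics can be sketched as follows (a hypothetical illustration only, not Continue's actual implementation: the `build_payload` function and field names are assumptions):

```python
# Sketch: fields from requestOptions.body are merged into the outgoing
# chat-completion payload, so arbitrary extra params reach the server.

def build_payload(base_payload: dict, request_options_body: dict) -> dict:
    """Return the base payload with requestOptions.body fields merged in."""
    merged = dict(base_payload)
    merged.update(request_options_body)  # extra fields override/extend
    return merged

base = {
    "model": "MODEL_NAME",
    "messages": [{"role": "user", "content": "hi"}],
    "stream": True,
}
payload = build_payload(base, {"user": "..."})
```

With the config above, the `user` field would then appear alongside `model`, `messages`, and `stream` in the request body the server receives.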
@sestinj that will solve the problem yes.
Has it been released on the pre-release yet? I didn't see where these arguments are passed to OpenAI in the code, and I don't see them arriving on the server side when running with Continue v0.9.71 (pre-release). Below is a log from a local endpoint:
{"messages":[{"role":"user","content":"hi"},{"role":"assistant","content":"Hello! How may I help you today? "},{"role":"user","content":"Hey"}],"model":"wizardcoder-34b","max_tokens":1024,"stream":true}
given this config:
{
  "title": "local-debug",
  "model": "wizardcoder-34b",
  "contextLength": 2048,
  "apiBase": "localhost:8000",
  "requestOptions": {
    "body": { "user": "value" }
  },
  "provider": "openai",
  "template": "none"
},
Ah, you are right, not yet released. Will be getting this out the door today. I'll share here when it's ready.
@payoto this is now in pre-release, so going to close the PR. Feel free to message me though if you find that it's not working as expected