Add usage of any OpenAI endpoint
I run my LLMs on text-generation-webui, which also provides an OpenAI-compatible endpoint. It would be great if we could supply our own generation parameters, since the server is unhappy with `max_length` being passed as -1 when using `--local`.
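To illustrate what I mean, here is a rough sketch of how the client could merge user-supplied generation parameters over sane defaults before sending a request, so a sentinel like -1 never reaches the server. The function and parameter names below are just placeholders, not the project's actual API:

```python
# Illustrative only: build a request body for an OpenAI-compatible
# endpoint (e.g. text-generation-webui's /v1/chat/completions),
# letting the user override generation parameters.
DEFAULTS = {
    "max_tokens": 512,   # explicit cap instead of the -1 sentinel
    "temperature": 0.7,
}

def build_payload(prompt, overrides=None):
    """Merge user overrides onto defaults and sanitize invalid values."""
    params = {**DEFAULTS, **(overrides or {})}
    # Some backends reject non-positive token limits, so replace any
    # -1/None sentinel with the default before sending.
    if not isinstance(params.get("max_tokens"), int) or params["max_tokens"] < 1:
        params["max_tokens"] = DEFAULTS["max_tokens"]
    return {
        "model": "local-model",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        **params,
    }

print(build_payload("hello", {"max_tokens": -1})["max_tokens"])  # prints 512
```

Something along these lines would let `--local` users tune parameters per backend instead of inheriting values the server rejects.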
If this is already in the pipeline, great! If not, I may investigate and open a PR with potential fixes.