MonkeyCode
Feature request: `OpenAI compatible` and `LM Studio` support
Hi, are there plans to add OpenAI-compatible and LM Studio support?
All of our AI models are deployed and run locally behind OpenAI-compatible servers or LM Studio.
On the model configuration page, select the ollama option, enter an OpenAI-compatible URL (e.g., http://127.0.0.1:8000/v1), and then choose the corresponding model. I used vLLM and verified that this works in practice.
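If it helps, here is a minimal sketch for checking that the local OpenAI-compatible endpoint answers before pointing MonkeyCode's ollama option at it. It assumes vLLM (or another OpenAI-compatible server) is already serving on http://127.0.0.1:8000/v1; the model name in the completion call is a placeholder, so substitute whatever model you launched.

```python
# Sanity-check a local OpenAI-compatible endpoint (e.g., vLLM) before
# configuring it in MonkeyCode. Requires the `openai` Python package (v1+).
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:8000/v1",  # same URL entered on the model configuration page
    api_key="not-needed-locally",          # local servers typically ignore the key
)

# List the models the server exposes; the id printed here is the name
# to select as the corresponding model in MonkeyCode.
for model in client.models.list():
    print(model.id)

# Minimal chat round-trip to confirm the endpoint works end to end.
resp = client.chat.completions.create(
    model="Qwen/Qwen2.5-Coder-7B-Instruct",  # placeholder; use the model you serve
    messages=[{"role": "user", "content": "Say hello."}],
)
print(resp.choices[0].message.content)
```

If both the model listing and the chat call succeed, the same base URL and model name should work when entered under the ollama option.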