Connect Cursor to a locally hosted LLM served by vLLM, etc.
Describe the solution you'd like
I would like to connect Cursor to a locally hosted quantised LLM running on vLLM or a similar server. vLLM and comparable projects all expose an OpenAI-compatible API with chat completions, so this should be reasonably easy to implement. At the moment you can only point Cursor at an Azure OpenAI endpoint. Go one step further and allow us to point it at any Chat Completions-compatible API, e.g. "http://localhost:1234/v1".
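To illustrate why this is straightforward: any client that speaks the Chat Completions protocol can talk to such a server just by changing the base URL. A minimal sketch in Python with only the standard library, assuming a local server at the URL from this request (the model name `my-local-model` is hypothetical, and local servers typically ignore the API key):

```python
import json
import urllib.request

def build_request(base_url, model, messages):
    """Build an OpenAI-style chat completions request (URL + JSON body)."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

def chat(base_url, model, messages):
    """POST the request to a local OpenAI-compatible server, return the first reply."""
    url, body = build_request(base_url, model, messages)
    req = urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Local servers usually accept any key; the header just has to exist.
            "Authorization": "Bearer not-needed",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Example (requires a server actually listening, e.g. vLLM's OpenAI-compatible server):
# chat("http://localhost:1234/v1", "my-local-model",
#      [{"role": "user", "content": "Hello"}])
```

The only Azure-specific parts of the existing integration would be the base URL and auth header, which is why a generic base-URL setting should cover vLLM and friends.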
I noticed that the issue probably arises when the endpoint uses http rather than https.
+1 for vLLM support