
Support LMStudio for Local AI Models

Open · deadcoder0904 opened this issue 6 months ago · 1 comment

I tend to use LMStudio instead of Ollama; I find it much better, although less popular.

Since the Vercel AI SDK is used, supporting it should be trivial, as both providers work the same way.

deadcoder0904 avatar Jun 10 '25 16:06 deadcoder0904

I think the better option would be to add an `api_base` param to the openai provider. That way you could use LMStudio's local inference endpoint, but also any other local provider that supports the OpenAI standard, which is almost every local provider.
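
For illustration, here is a minimal sketch of what that could look like with the Vercel AI SDK's OpenAI provider, assuming LMStudio's default local server on port 1234 (the model name and prompt are placeholders; a local server typically ignores the API key but the SDK requires a non-empty value):

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

// Point the OpenAI provider at any OpenAI-compatible local endpoint.
// LMStudio serves one at http://localhost:1234/v1 by default.
const lmstudio = createOpenAI({
  baseURL: "http://localhost:1234/v1", // the "api_base" suggested above
  apiKey: "lm-studio",                 // placeholder; local servers usually ignore it
});

const { text } = await generateText({
  // Use whatever model identifier is loaded in LMStudio.
  model: lmstudio("qwen2.5-7b-instruct"),
  prompt: "Say hello",
});

console.log(text);
```

The same `baseURL` swap would also cover other OpenAI-compatible local servers (e.g. llama.cpp's server mode), which is why a single base-URL param is broader than adding a dedicated LMStudio provider.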

yuisheaven avatar Jun 16 '25 15:06 yuisheaven