claude-task-master
Support LMStudio for Local AI Models
I tend to use LM Studio instead of Ollama; in my experience it's much better, although less popular.
Since the Vercel AI SDK is already used, supporting it should be trivial: both expose the same OpenAI-compatible interface.
I think the better option would be to add an api_base param to the openai provider. That way you could point it at LM Studio's local inference endpoint, but also at any other local provider that supports the OpenAI standard, which is almost every local provider.
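As a rough sketch of what this could look like with the Vercel AI SDK: `createOpenAI` from `@ai-sdk/openai` already accepts a `baseURL` option, so an `api_base` param could be wired straight through to it. The config shape (`api_base`, `model`) and the helper function here are hypothetical, and `http://localhost:1234/v1` is just LM Studio's default local endpoint.

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Hypothetical provider config: api_base would come from task-master's config file.
interface OpenAICompatibleConfig {
  api_base?: string; // e.g. LM Studio's default: http://localhost:1234/v1
  api_key?: string;  // local servers usually ignore this, but the SDK expects one
  model: string;
}

// Sketch: build a provider that targets any OpenAI-compatible endpoint.
function makeProvider(config: OpenAICompatibleConfig) {
  return createOpenAI({
    baseURL: config.api_base, // undefined falls back to the real OpenAI API
    apiKey: config.api_key ?? 'not-needed',
  });
}

// Usage: point the existing openai provider at LM Studio instead.
const lmstudio = makeProvider({
  api_base: 'http://localhost:1234/v1',
  model: 'qwen2.5-7b-instruct', // whatever model is loaded in LM Studio
});

async function demo() {
  const { text } = await generateText({
    model: lmstudio('qwen2.5-7b-instruct'),
    prompt: 'Say hello.',
  });
  console.log(text);
}
```

Because the same `baseURL` knob works for Ollama's OpenAI-compatible endpoint (`http://localhost:11434/v1`) and most other local servers, one param covers all of them.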