dexter
LM Studio Support
As a fan of local models and this CLI, I added support for LM Studio, an application that exposes local endpoints compatible with OpenAI's API.
LinkedIn: https://www.linkedin.com/in/lrdoc/
Example of the addition: https://github.com/user-attachments/assets/f69ebb08-2c99-480b-99a1-cd65f26d9d67
GitHub Generated: This pull request adds support for using LM Studio as a local LLM provider, alongside Ollama, and ensures users can select and use LM Studio models throughout the application. The changes update documentation, environment configuration, model selection logic, and provider handling to fully integrate LM Studio.
LM Studio integration:
- Added LM Studio configuration to `env.example` and documented its usage in `README.md` for local LLM support.
- Implemented a `getLMStudioModels` utility to fetch available models from a running LM Studio instance, following the OpenAI-compatible API.
- Updated model selection UI and CLI logic to list, select, and handle LM Studio models similarly to Ollama, including prefix handling and user instructions.
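The model-fetching utility above can be sketched roughly as follows. This is a hedged illustration, not the PR's actual code: `http://localhost:1234/v1` is LM Studio's documented default base URL, and the `parseModelIds` helper is hypothetical, split out here only so the response-parsing step is easy to see.

```typescript
// Shape of the OpenAI-compatible GET /v1/models response that LM Studio serves.
interface ModelsResponse {
  data: { id: string }[];
}

// Hypothetical helper: extract model ids from the response body.
function parseModelIds(body: ModelsResponse): string[] {
  return body.data.map((m) => m.id);
}

// Sketch of a getLMStudioModels-style utility; the real implementation
// in the PR may differ in naming and error handling.
async function getLMStudioModels(
  baseUrl = "http://localhost:1234/v1"
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`);
  if (!res.ok) {
    throw new Error(`LM Studio returned HTTP ${res.status}`);
  }
  return parseModelIds((await res.json()) as ModelsResponse);
}
```

If no LM Studio instance is running, the `fetch` rejects and the caller can fall back to cloud providers, which matches the optional nature of local support.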
Provider and model handling:
- Added LM Studio as a recognized provider in environment and model logic, ensuring correct model prefixing, skipping the API key flow, and configuring the base URL.
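The provider handling described above could look roughly like this. All identifiers here (`resolveProvider`, the prefix constants, the `Provider` shape) are illustrative assumptions, not the PR's actual names; the ports are the documented defaults for LM Studio (1234) and Ollama (11434).

```typescript
// Illustrative prefix constants for local providers.
const LMSTUDIO_PREFIX = "lmstudio/";
const OLLAMA_PREFIX = "ollama/";

type Provider = {
  name: string;
  requiresApiKey: boolean; // local providers skip the API key flow
  baseUrl?: string;        // local providers get a fixed base URL
};

// Hypothetical resolver: map a prefixed model id to its provider config.
function resolveProvider(model: string): Provider {
  if (model.startsWith(LMSTUDIO_PREFIX)) {
    return { name: "lmstudio", requiresApiKey: false, baseUrl: "http://localhost:1234/v1" };
  }
  if (model.startsWith(OLLAMA_PREFIX)) {
    return { name: "ollama", requiresApiKey: false, baseUrl: "http://localhost:11434" };
  }
  // Anything unprefixed falls through to a cloud provider that needs a key.
  return { name: "openai", requiresApiKey: true };
}
```

Keeping this in one resolver means the CLI's model-selection code never needs provider-specific branches of its own.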
Agent and tool execution updates:
- Modified `ToolExecutor` and the agent orchestrator to use the currently selected model (including local providers) for tool selection, instead of a hardcoded model.
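A minimal sketch of that change, under the assumption that tool selection is a chat call: the constructor parameters and `selectTool` method here are hypothetical, shown only to contrast a hardcoded model with the user's current selection.

```typescript
// Hypothetical chat function signature: (model, prompt) -> completion text.
type ChatFn = (model: string, prompt: string) => Promise<string>;

// Sketch of the updated ToolExecutor; the real class in the PR may differ.
class ToolExecutor {
  constructor(
    private chat: ChatFn,
    private selectedModel: string // injected by the orchestrator
  ) {}

  // Before the change, the model argument was hardcoded; now the currently
  // selected model is used, so local lmstudio/ and ollama/ models work too.
  async selectTool(prompt: string): Promise<string> {
    return this.chat(this.selectedModel, prompt);
  }
}
```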