LM Studio Support
LM Studio is a desktop app for developing and experimenting with LLMs locally on your computer.
It exposes a local server with an OpenAI-compatible API, making it a drop-in replacement for OpenAI SDKs.
Example:
from openai import OpenAI

# LM Studio ignores the API key, but the OpenAI SDK requires one
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")
response = client.chat.completions.create(model="llama-3-8b",  # id of the model loaded in LM Studio
    messages=[{"role": "user", "content": "Hi"}])
print(response.choices[0].message.content)
https://lmstudio.ai/docs/app/api/endpoints/openai
https://lmstudio.ai/docs/app/api/endpoints/rest
For anyone who needs to use LM Studio now, here is an example showing how to get this working with the calculator example:
https://github.com/grayfallstown/koog-with-lmstudio-and-local-models
The example uses Phi-4-Mini-Instruct as the model; I really recommend using something larger, and you will see why.
Listing available models does not work; it fails with: Field 'created' is required for type with serial name 'ai.koog.prompt.executor.clients.openai.models.OpenAIModel'.
LM Studio does not return the created field. LM Studio output for a GET to /v1/models:
{
  "data": [
    {
      "id": "qwen/qwen2.5-vl-7b",
      "object": "model",
      "owned_by": "organization_owner"
    },
    {
      "id": "qwen/qwen3-vl-8b",
      "object": "model",
      "owned_by": "organization_owner"
    },
    {
      "id": "google/gemma-3-12b",
      "object": "model",
      "owned_by": "organization_owner"
    },
    {
      "id": "text-embedding-nomic-embed-text-v1.5",
      "object": "model",
      "owned_by": "organization_owner"
    }
  ],
  "object": "list"
}
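To illustrate the failure mode, here is a minimal Python sketch of a lenient deserializer that tolerates the missing field: `created` gets a default value instead of being required. The `ModelInfo` class and the default of 0 are assumptions for illustration, not Koog's actual types or behavior.

```python
import json
from dataclasses import dataclass

@dataclass
class ModelInfo:
    id: str
    object: str
    owned_by: str
    created: int = 0  # assumed default: LM Studio omits this field

def parse_models(payload: str) -> list[ModelInfo]:
    # Parse an OpenAI-style GET /v1/models response body
    body = json.loads(payload)
    return [ModelInfo(**entry) for entry in body["data"]]

# A trimmed version of the LM Studio response shown above parses cleanly:
lmstudio_json = """
{"data": [
  {"id": "qwen/qwen2.5-vl-7b", "object": "model", "owned_by": "organization_owner"},
  {"id": "google/gemma-3-12b", "object": "model", "owned_by": "organization_owner"}
], "object": "list"}
"""
models = parse_models(lmstudio_json)
print([m.id for m in models])
```

With a strict parser (the equivalent of making `created` a required field), the same input raises an error, which matches the failure seen in Koog.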