
Unable to use lm_studio model with aider

Open planetf1 opened this issue 2 months ago • 2 comments

Issue

I'm trying to use a local qwen3-coder model with aider on macOS 26 (arm), but aider fails with:

No models match "{'qwen': {'model': 'lm_studio/qwen3-coder-30b-a3b-instruct-mlx', 'openai_api_base': 'http://localhost:1234/v1', 'openai_api_key':

The config file contains:

models:
  qwen:
    model: lm_studio/qwen3-coder-30b-a3b-instruct-mlx
    openai_api_base: http://localhost:1234/v1
    openai_api_key: unused

I can access the model fine with curl:

 curl http://localhost:1234/v1/models | jq '.data[].id'

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  2824  100  2824    0     0   748k      0 --:--:-- --:--:-- --:--:--  919k
"qwen/qwen3-1.7b"
"qwen/qwen3-30b-a3b-2507"
"microsoft/phi-4-reasoning-plus"
"microsoft/phi-4"
"text-embedding-nomic-embed-text-v1.5"
"qwen3-30b-a3b-thinking-2507"
"qwen3-4b-thinking-2507-mlx@6bit"
"qwen/qwen3-4b-thinking-2507"
"qwen3-coder-30b-a3b-instruct-480b-distill-v2"
"phi-4-mini-instruct"
"microsoft/phi-4-mini-reasoning"
"qwen2.5-coder-14b-qiskit"
"text-embedding-nomic-embed-text-v2"
"granite-3.3-8b-qiskit"
"qwen/qwen2.5-coder-14b"
"granite-3.3-8b-instruct"
"granite-3.3-2b-instruct"
"granite-4.0-tiny-preview"
"text-embedding-granite-embedding-107m-multilingual"
"text-embedding-granite-embedding-278m-multilingual"
"mistralai/devstral-small-2507"
"qwen3-coder-30b-a3b-instruct-mlx"
"openai/gpt-oss-20b"
➜  qiskit_studio git:(onepl2) ✗ curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3-coder-30b-a3b-instruct-mlx",
    "messages": [
      {
        "role": "user",
        "content": "Tell me a joke about quantum computing."
      }
    ]
  }'
{
  "id": "chatcmpl-y6h6hguelf9asabesgq4u",
  "object": "chat.completion",
  "created": 1757542665,
  "model": "qwen3-coder-30b-a3b-instruct-mlx",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Why did the quantum computer break up with its partner?\n\nBecause it couldn't make a definite choice! \n\nThe quantum computer kept saying \"I'm in superposition of love and indifference\" but never actually picked a relationship status. It was always \"entangled\" in uncertainty about whether to commit or not, never able to collapse its wave function into a clear decision.\n\nThe partner eventually said \"I can't handle your quantum entanglement with other people!\" and left for the classical world where at least you know if someone is available or not.\n\nThe quantum computer was left alone, still in superposition: \"I'm both happy and sad that we're not together, but I can't tell you which one is real until you observe my state.\"",
        "reasoning_content": "",
        "tool_calls": []
      },
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 16,
    "completion_tokens": 151,
    "total_tokens": 167
  },
  "stats": {},
  "system_fingerprint": "qwen3-coder-30b-a3b-instruct-mlx"
}%

Verbose output:

➜  qiskit_studio git:(onepl2) ✗ cat /tmp/x
Config files search order, if no --config:
  - /Users/jonesn/src/testproject/.aider.conf.yml
  - /Users/jonesn/.aider.conf.yml (exists)
────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
Too soon to check version: 0.5 hours
Command Line Args:   --model lm_studio/qwen3-coder-30b-a3b-instruct-mlx -v
Config File (/Users/jonesn/.aider.conf.yml):
  models:            {'qwen': {'model': 'lm_studio/qwen3-coder-30b-a3b-instruct-mlx', 'openai_api_base': 'http://localhost:1234/v1', 'openai_api_key':
'unused'}}
Defaults:
  --set-env:         []
  --api-key:         []
  --model-settings-file:.aider.model.settings.yml
  --model-metadata-file:.aider.model.metadata.json
  --cache-keepalive-pings:0
  --map-refresh:     auto
  --map-multiplier-no-files:2
  --input-history-file:/Users/jonesn/src/testproject/.aider.input.history
  --chat-history-file:/Users/jonesn/src/testproject/.aider.chat.history.md
  --user-input-color:#00cc00
  --tool-error-color:#FF2222
  --tool-warning-color:#FFA500
  --assistant-output-color:#0088ff
  --code-theme:      default
  --aiderignore:     /Users/jonesn/src/testproject/.aiderignore
  --lint-cmd:        []
  --test-cmd:        []
  --voice-format:    wav
  --voice-language:  en
  --encoding:        utf-8
  --line-endings:    platform
  --env-file:        /Users/jonesn/src/testproject/.env

Option settings:
  - 35turbo: False
  - 4: False
  - 4_turbo: False
  - 4o: False
  - add_gitignore_files: False
  - aiderignore: /Users/jonesn/src/testproject/.aiderignore
  - alias: None
  - analytics: None
  - analytics_disable: False
  - analytics_log: None
  - analytics_posthog_host: None
  - analytics_posthog_project_api_key: None
  - anthropic_api_key: None
  - api_key: []
  - apply: None
  - apply_clipboard_edits: False
  - assistant_output_color: #0088ff
  - attribute_author: None
  - attribute_co_authored_by: True
  - attribute_commit_message_author: False
  - attribute_commit_message_committer: False
  - attribute_committer: None
  - auto_accept_architect: True
  - auto_commits: True
  - auto_lint: True
  - auto_test: False
  - cache_keepalive_pings: 0
  - cache_prompts: False
  - chat_history_file: /Users/jonesn/src/testproject/.aider.chat.history.md
  - chat_language: None
  - check_model_accepts_settings: True
  - check_update: True
  - code_theme: default
  - commit: False
  - commit_language: None
  - commit_prompt: None
  - completion_menu_bg_color: None
  - completion_menu_color: None
  - completion_menu_current_bg_color: None
  - completion_menu_current_color: None
  - config: None
  - copy_paste: False
  - dark_mode: False
  - deepseek: False
  - detect_urls: True
  - dirty_commits: True
  - disable_playwright: False
  - dry_run: False
  - edit_format: None
  - editor: None
  - editor_edit_format: None
  - editor_model: None
  - encoding: utf-8
  - env_file: /Users/jonesn/src/testproject/.env
  - exit: False
  - fancy_input: True
  - file: None
  - files: []
  - git: True
  - git_commit_verify: False
  - gitignore: True
  - gui: False
  - haiku: False
  - input_history_file: /Users/jonesn/src/testproject/.aider.input.history
  - install_main_branch: False
  - just_check_update: False
  - light_mode: False
  - line_endings: platform
  - lint: False
  - lint_cmd: []
  - list_models: {'qwen': {'model': 'lm_studio/qwen3-coder-30b-a3b-instruct-mlx', 'openai_api_base': 'http://localhost:1234/v1', 'openai_api_key':
'unused'}}
  - llm_history_file: None
  - load: None
  - map_multiplier_no_files: 2
  - map_refresh: auto
  - map_tokens: None
  - max_chat_history_tokens: None
  - message: None
  - message_file: None
  - mini: False
  - model: lm_studio/qwen3-coder-30b-a3b-instruct-mlx
  - model_metadata_file: .aider.model.metadata.json
  - model_settings_file: .aider.model.settings.yml
  - multiline: False
  - notifications: False
  - notifications_command: None
  - o1_mini: False
  - o1_preview: False
  - openai_api_base: None
  - openai_api_deployment_id: None
  - openai_api_key: None
  - openai_api_type: None
  - openai_api_version: None
  - openai_organization_id: None
  - opus: False
  - pretty: True
  - read: None
  - reasoning_effort: None
  - restore_chat_history: False
  - set_env: []
  - shell_completions: None
  - show_diffs: False
  - show_model_warnings: True
  - show_prompts: False
  - show_release_notes: None
  - show_repo_map: False
  - skip_sanity_check_repo: False
  - sonnet: False
  - stream: True
  - subtree_only: False
  - suggest_shell_commands: True
  - test: False
  - test_cmd: []
  - thinking_tokens: None
  - timeout: None
  - tool_error_color: #FF2222
  - tool_output_color: None
  - tool_warning_color: #FFA500
  - upgrade: False
  - user_input_color: #00cc00
  - verbose: True
  - verify_ssl: True
  - vim: False
  - voice_format: wav
  - voice_input_device: None
  - voice_language: en
  - watch_files: False
  - weak_model: None
  - yes_always: None

Checking imports for version 0.86.0 and executable /opt/homebrew/Cellar/aider/0.86.0/libexec/bin/python
Installs file: /Users/jonesn/.aider/installs.json
Installs file exists and loaded
Not first run, loading imports in background thread
No model settings files loaded
Searched for model settings files:
  - /Users/jonesn/.aider.model.settings.yml
  - /Users/jonesn/src/testproject/.aider.model.settings.yml
Loaded model metadata from:
  - /opt/homebrew/Cellar/aider/0.86.0/libexec/lib/python3.12/site-packages/aider/resources/model-metadata.json
No models match "{'qwen': {'model': 'lm_studio/qwen3-coder-30b-a3b-instruct-mlx', 'openai_api_base': 'http://localhost:1234/v1', 'openai_api_key':
'unused'}}".
➜  qiskit_studio git:(onepl2) ✗

I also get the same error if I set the LM Studio environment variables as described at https://aider.chat/docs/llms/lm-studio.html
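
For reference, the environment-variable setup on that page looks roughly like this (model name taken from my list above; the key just needs to be set to some dummy value):

  export LM_STUDIO_API_BASE=http://localhost:1234/v1
  export LM_STUDIO_API_KEY=dummy-api-key
  aider --model lm_studio/qwen3-coder-30b-a3b-instruct-mlx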

I'm a first-time user, so I may have missed something obvious.

Version and model info

Aider 0.86.0

planetf1 · Sep 10 '25

Here's how I got mlx-lm working. Perhaps you can use that instead.

https://github.com/Aider-AI/aider/issues/4526

mattharrison · Sep 21 '25

models:
  qwen:
    model: lm_studio/qwen3-coder-30b-a3b-instruct-mlx
    openai_api_base: http://localhost:1234/v1
    openai_api_key: unused

This doesn't look like valid syntax.

Please read: https://aider.chat/docs/llms/lm-studio.html
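
For comparison, a config that sticks to documented option names might look something like this (a sketch only, with values taken from the issue above; model and set-env are documented aider options, and the dummy key value is a placeholder):

  model: lm_studio/qwen3-coder-30b-a3b-instruct-mlx
  set-env:
    - LM_STUDIO_API_BASE=http://localhost:1234/v1
    - LM_STUDIO_API_KEY=dummy-api-key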

gcp · Oct 04 '25