
Running OpenCode Desktop application with local model & censorship

Noorababi opened this issue 2 weeks ago · 1 comment

Question

Hi, I'm using LM Studio to run an inference server for my local use.

I have created this config file in my %USERPROFILE%\.config\opencode folder:

{ "$schema": "https://opencode.ai/config.json", "provider": { "lmstudio": { "npm": "@ai-sdk/openai-compatible", "name": "LM Studio (Local)", "options": { "baseURL": "localhost:15432/v1" }, "models": { "mistralai/devstral-small-2-2512": { "name": "Mistral Devstral Small", "limit": { "context": 32280, "output": 8192 } } } } } }

But I've run into two problems.

  1. When used through opencode, the model refuses to do any pen-testing tasks (considered illegal by some, but I'm running my own model). It looks like opencode is overriding the default system prompt I set for the model inside LM Studio, which is the prompt that removes the censorship.
     [screenshot of the refusal]
  2. The local model list displayed inside opencode does not match any model I actually have. However, if I select one of them, the messages do reach my real local model (Mistral), so I'm not sure why the real model name is not showing inside opencode. It just causes confusion, nothing important; see the sketch below for how I'd expect to pin the default model.
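
For reference, this is how I would expect to pin the default model in the same config file, assuming opencode accepts a top-level "model" key in provider/model-id form (I may be wrong about that key):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "lmstudio/mistralai/devstral-small-2-2512"
}
```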

Noorababi · Jan 03 '26 22:01