Running OpenCode Desktop application with local model & censorship
Question
Hi, I'm using LM Studio to run an inference server for my local use.
I have created this config file in my %USERPROFILE%\.config\opencode folder:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (Local)",
      "options": {
        "baseURL": "localhost:15432/v1"
      },
      "models": {
        "mistralai/devstral-small-2-2512": {
          "name": "Mistral Devstral Small",
          "limit": {
            "context": 32280,
            "output": 8192
          }
        }
      }
    }
  }
}
```
But I ran into two problems.
- When used through opencode, the model refuses to do any pen-testing tasks (some consider these illegal, but I'm running my own model). It seems opencode is overriding the default system prompt I set inside LM Studio, which is what removes the censorship.
- The local model list displayed inside opencode doesn't match any model I actually have. However, if I select any of them, the messages still reach my real local model (Mistral), so I'm not sure why the real model tag isn't showing inside opencode. It just causes confusion; not important.
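For the second problem, one way to see the mismatch is to compare the model IDs in the config against what the server actually reports. This is a minimal sketch, assuming LM Studio exposes the standard OpenAI-compatible `GET /v1/models` endpoint on the port from the config above; the helper names are my own, not part of opencode.

```python
import json
import urllib.request

def fetch_model_ids(base_url: str) -> list[str]:
    """Query an OpenAI-compatible /v1/models endpoint and return the served model IDs."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]

def missing_models(configured: list[str], served: list[str]) -> list[str]:
    """Return configured model IDs that the server does not actually serve."""
    return [m for m in configured if m not in served]

if __name__ == "__main__":
    # Assumes LM Studio is listening on the baseURL from the opencode config.
    served = fetch_model_ids("http://localhost:15432/v1")
    print("served:", served)
    print("configured but not served:",
          missing_models(["mistralai/devstral-small-2-2512"], served))
```

If the ID printed under "served" differs from the key used in the `models` section of the config, that would explain why opencode shows a different tag while requests still land on the same loaded model.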