Beta Tester of OpenCoder
I will actively support this project, and if you'd like, I can test every new commit :)
I really like it as an alternative; I use Claude Code and spend too much money on the API. If you need a hand, just ask :)
I'll try the beta version and keep you updated.
Thanks for your work!
@ducan-ne
In the terminal: `ANTHROPIC_API_KEY=******* npx opencoder@next`
Every request (I tested "hello") shows:
```
prompt is too long: 204076 tokens > 200000 maximum
```
Could you tell me how to implement OpenRouter? It's not very clear to me unfortunately. I know that when there will be a documentation it will be clearer
@lcava000 Great to see the first contribution to the project, thank you! Sorry for the late reply, I'll get back to you tomorrow.
tl;dr: your request hit the context limit because I autoload all necessary files into the system prompt. Try changing to this: `export default { experimental: { autoLoad: [] } }`
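To make the placement concrete, here is a sketch of a full config file with autoload disabled. It follows the config shape shared later in this thread (`coder.config.js`/`.ts` in the project root, default-exporting a `Config`); treat it as a sketch rather than a definitive reference.

```typescript
// coder.config.ts — sketch with autoload disabled to avoid the context-limit error
import type { Config } from 'opencoder'

export default {
  experimental: {
    // An empty array stops OpenCoder from preloading project files
    // into the system prompt, which was blowing past the 200k-token limit
    autoLoad: [],
  },
} satisfies Config
```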
btw, the beta channel is not up to date because I'm currently publishing to latest instead; will share more tomorrow
Sorry @ducan-ne, it's not clear to me where to insert this.
I also wanted to point out that when I run `npx opencoder@latest`, it errors out and doesn't open anything:
```
Warning: Detected unsettled top-level await at file:///Users/****/.npm/_npx/6ee68d4008bbfff8/node_modules/opencoder/dist/cli.js:7
```
Whereas when I run `npx opencoder@next`, the [email protected] screen opens correctly, but since I haven't entered any API key, it obviously doesn't work.
Where should the .env be placed to use OpenRouter?
Hi @lcava000
The @next is behind @latest by at least 20 patches, so I think I did something wrong. Can you share your Node.js version?
Note that OpenCoder requires at least Node 22.14.0. I've tried it on my two devices and it's working fine. 🤔
> it's not clear to me where to insert this.
In the `coder.config.js` file. Btw, I think I'll minimize autoLoad soon. Currently, autoLoad uses this pattern: `"**/*.{js,ts,jsx,tsx,json,md,css,html,yml,yaml,py,go,rs}"`
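If disabling autoload entirely is too blunt, a narrower setting might also keep the prompt under the limit. This sketch assumes `autoLoad` accepts an array of glob patterns like the default shown above; the exact accepted shape is an assumption, not documented behavior.

```typescript
// coder.config.ts — sketch: narrow autoLoad instead of disabling it.
// Assumes autoLoad takes glob patterns, like the default pattern above.
import type { Config } from 'opencoder'

export default {
  experimental: {
    // Only preload TypeScript sources and top-level docs
    // instead of every matching file in the repository
    autoLoad: ['src/**/*.{ts,tsx}', '*.md'],
  },
} satisfies Config
```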
Hi 👋 You were totally right — I was using Node version 23.** and that was causing the issue. After switching to the LTS 22.14.0 version, everything works fine.
I also tried using the OpenRouter SDK, but it seems that some LLMs respond differently, which might be causing some inconsistent behavior.
This is my config file:
```typescript
import { createOpenRouter } from '@openrouter/ai-sdk-provider'
import type { Config } from 'opencoder'

const openrouter = createOpenRouter({
  apiKey: '*****************',
})

const modelA = openrouter('openrouter/quasar-alpha')

export default {
  model: modelA,
} satisfies Config
```
In one of the tests, I ran into this error:
An error occurred while processing your request:
Hi @lcava000, sorry for the late reply. I didn't see any notification. Feel free to ping me on Telegram (t.me/duc_an) or Discord (not sure how to get the profile link) for a faster response.
> You were totally right — I was using Node version 23.** and that was causing the issue. After switching to the LTS 22.14.0 version, everything works fine.
This is not expected; I'll take a look at Node.js 23 another time.
> I also tried using the OpenRouter SDK, but it seems that some LLMs respond differently, which might be causing some inconsistent behavior.
Yeah, I think so, but I haven't tried OpenRouter, so I can't say for sure. Does this model support tool calling? OpenCoder (and Claude Code) is built on top of tool calling. In my experience, models like Gemini 2, GPT-4o, Ollama QwQ, and Claude 3.5 support tool calling 🤔
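To make the dependency on tool calling concrete, here is a minimal, self-contained sketch of the loop: a tool-capable model emits a structured call instead of plain text, the client dispatches it to a registered tool, and the result is fed back into the conversation. The names (`readFile`, `ToolCall`) are illustrative only, not OpenCoder's actual API; a model without tool-calling support never produces the structured call, which is why such models misbehave here.

```typescript
// Sketch of the tool-calling loop agents like OpenCoder rely on.
// All names here are illustrative, not OpenCoder's real interfaces.

type ToolCall = { name: string; args: Record<string, string> }

// The client registers the tools the model is allowed to invoke.
const tools: Record<string, (args: Record<string, string>) => string> = {
  // Stub: a real implementation would read from disk.
  readFile: (args) => `contents of ${args.path}`,
}

// Execute a structured call emitted by the model and return the result
// that would be appended to the conversation as a tool message.
function dispatch(call: ToolCall): string {
  const tool = tools[call.name]
  if (!tool) throw new Error(`unknown tool: ${call.name}`)
  return tool(call.args)
}

// Example: the model asks to read README.md; the client runs the tool.
const reply: ToolCall = { name: 'readFile', args: { path: 'README.md' } }
console.log(dispatch(reply)) // contents of README.md
```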
@lcava000 I couldn't reproduce your issue with OpenRouter on my device. I just published [email protected]. Can you give it a try? I believe I've been publishing every time, but just to be sure, I published a new version.
Here is `bunx opencoder@latest` on my device:
@lcava000 I just published [email protected] with these updates:
- Add file edit tool
- Better performance for large edits: only the last 5 messages are dynamic, while the others become static (they can be cleared with Cmd+K). This is experimental and currently a bottleneck for launching the app; I'm testing it right now
- Better diffing and code highlighting for the File write tool