
bug: DecimalError

Open Locbac opened this issue 2 months ago • 2 comments

Description

No matter which model I use or what message I type, I end up getting this error: Error: [DecimalError] Invalid argument: [object Object]. I originally thought it was because my locale is French, and in French numbers are formatted as 35,33 while in English they're 35.33. However, changing my locale to English and rebooting didn't fix it, and neither did running LC_ALL=en_US.UTF-8 opencode in the terminal.

OpenCode version

0.3.58

Steps to reproduce

  1. Launch opencode: opencode
  2. Type any message (e.g. test) with any model
  3. See the error:
test
amon (10:31 AM)

I see you've entered "test". Are you looking to run existing tests, or do you have a specific task you'd like me to test or implement?

If you want to run tests, I need to know the command, which I can try to infer from the project structure (e.g., nix-build, pytest, npm test).

If you have a task, please describe it. If you'd like to see my current task list, I can use the todoread tool.
gemini-flash-lite-latest (10:31 AM)

Error: [DecimalError] Invalid argument: [object Object]

For small outputs the model's response comes through in full. For any real work, like asking it to edit a file, the output gets cut off.

refactor the inspiron host to make things more clear
amon (10:32 AM)

I will start by listing the files in the hosts/inspiron directory to understand the current structure and identify what needs refactoring for clarity.

[tool_call:list for path '/hosts/inspiron']
gemini-flash-lite-latest (10:32 AM)

Error: [DecimalError] Invalid argument: [object Object]

Note that the model's output was cut off.

Screenshot and/or share link

https://opencode.ai/s/C1aqyaC6

Operating System

macOS 15.6.1 (24G90)

Terminal

Wave

Locbac avatar Dec 25 '25 15:12 Locbac

This issue might be a duplicate of existing issues. Please check:

  • #6010: [DecimalError] Invalid argument: [object Object] - Similar DecimalError occurring on Opus 4.5
  • #5584: LLM Output cut off / stops midway in UI but appears fine in transcript - Related to model output being cut off

Feel free to ignore if none of these address your specific case.

github-actions[bot] avatar Dec 25 '25 15:12 github-actions[bot]

If you are on version 0.3 can you upgrade? This is an extremely old version

rekram1-node avatar Dec 25 '25 15:12 rekram1-node

OK, I updated and it works now. Oddly, I had it set to unstable with nix-darwin, but it didn't pick up the new version until I ran nix flake update:

  environment.systemPackages = with pkgs; [
     ...
    (nixpkgs-unstable.legacyPackages.${pkgs.system}.opencode)
    ...
  ];

Likely an issue on my side with how my packages were managed
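
For anyone hitting the same thing: in a flake-based nix-darwin setup like the snippet above, the nixpkgs-unstable input stays pinned to whatever revision is recorded in flake.lock until the lock file is explicitly updated, so "unstable" can silently serve a months-old opencode. A rough sketch of the update steps (the input name nixpkgs-unstable and the host name are assumptions; adjust to your flake):

  # Refresh the pinned revision of the unstable input in flake.lock
  # (on older nix: nix flake lock --update-input nixpkgs-unstable)
  nix flake update nixpkgs-unstable

  # Rebuild the system so the newer opencode is actually installed
  darwin-rebuild switch --flake .#your-host

After that, opencode --version should report the version currently in nixpkgs-unstable.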

Locbac avatar Dec 26 '25 13:12 Locbac

issue still exists in

❯ opencode --version
0.15.14

I got it from nixos-unstable

jakob1379 avatar Jan 13 '26 16:01 jakob1379

@jakob1379 You are on a very old version, please consider updating to latest

rekram1-node avatar Jan 13 '26 16:01 rekram1-node

@jakob1379 You are on a very old version, please consider updating to latest

Jeez, thought I was past forgetting to run "nix flake update"... lol. Still, thank you for taking the time to answer, much appreciated.

jakob1379 avatar Jan 14 '26 09:01 jakob1379