
Ubuntu CLI: adding a custom provider gives no error but saves nothing, while Windows Git Bash saves but then also gets stuck configuring

cbruyndoncx opened this issue 3 months ago • 10 comments

Describe the bug

Trying out the custom provider on the CLI. I did a similar dummy setup in the GUI, where it creates 2 entries (an issue was already raised on Discord), but in the CLI nothing is created: no error, nothing. See screenshot.

To Reproduce

Steps to reproduce the behavior:

  1. goose configure
  2. custom providers (add / remove option present)
  3. Add custom
  4. Enter minimal fields needed in order to save

Expected behavior

I expected the custom provider to be visible in the selection list, but nothing is shown. Wondering what happened, I tried to remove what was just added, but there it confirms that no custom provider has been saved. I was expecting something in config.yaml, or a separate directory in the config folder, but there are no changes there either.

Screenshots

Image

Please provide the following information:

  • OS & Arch: [Ubuntu 24.04 x86]
  • Interface: [CLI]
  • Version: [v1.7.0]
  • Extensions enabled: [n/a]
  • Provider & Model: [n/a]

cbruyndoncx avatar Sep 01 '25 08:09 cbruyndoncx

It is getting interesting: on Windows, using the Git Bash environment, it looks like it is working, same v1.7.0.

Image

cbruyndoncx avatar Sep 01 '25 08:09 cbruyndoncx

But when trying to use it through Configure Providers, it is clear it will not work, as it refers to OPENAI_API_KEY:

Image

Skipping through all options:

Image

The error is expected since it is a dummy provider, but I am surprised by the environment variables used.

Checking the config, I have a custom_providers directory with a custom_xxx.json, which is good. The contents are:

{
  "name": "custom_xxx",
  "engine": "openai",
  "display_name": "xxx",
  "description": "Custom xxx provider",
  "api_key_env": "CUSTOM_XXX_API_KEY",
  "base_url": "https://xxx.com",
  "models": [
    {
      "name": "llm",
      "context_limit": 128000,
      "input_token_cost": null,
      "output_token_cost": null,
      "currency": null,
      "supports_cache_control": null
    }
  ],
  "headers": null,
  "timeout_seconds": null,
  "supports_streaming": false
}

I was expecting a host and endpoint configuration, similar to the OpenAI set of variables.
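For reference, here is a minimal shell sketch of how these files appear to map to environment variables. It only parses pretty-printed JSON like the file shown above; the path and field names are taken from this thread, not from goose documentation:

```shell
# Sketch: list each custom provider file and the env var it expects the
# API key in (the "api_key_env" field). Assumes pretty-printed JSON with
# one field per line, as in the example above; no jq required.
list_key_vars() {
  dir="${1:-$HOME/.config/goose/custom_providers}"
  for f in "$dir"/*.json; do
    [ -e "$f" ] || continue
    key_var=$(sed -n 's/.*"api_key_env": *"\([^"]*\)".*/\1/p' "$f")
    printf '%s -> %s\n' "$(basename "$f")" "$key_var"
  done
}
list_key_vars
# goose then expects the key under that exact name, e.g.:
#   export CUSTOM_XXX_API_KEY="sk-..."
```

If that variable is not exported, the key presumably has to come from the keyring instead.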

cbruyndoncx avatar Sep 01 '25 08:09 cbruyndoncx

So I set up a second dummy custom provider (Ollama compatible) with the same kind of dummy data, but the error message is different, which makes me wonder why. Possibly there is no model validation?

Image

cbruyndoncx avatar Sep 01 '25 08:09 cbruyndoncx

I am also finding that running goose configure on Ubuntu 24 + bash seems to be broken. Just adding an Anthropic Sonnet api_key doesn't work: I save it, then when I try to run goose it says that I need to configure a model provider. The same process works fine for me on my MacBook Pro + zsh.

Update: it seemed in part to be due to adding a provider (Azure) that didn't work correctly. Even though I later added Anthropic correctly, it wasn't used. Removing the config that had saved Azure and then re-adding Anthropic resolved the issue. But there does seem to be a need for better error messages / failure checking.

ck37 avatar Sep 03 '25 12:09 ck37

I have the same problem on Kubuntu 24.04. I tried with a new user and the custom provider config succeeded. So I removed the two settings dirs below, installed the canary version with the upgrade command, and could see config.yaml changing this time, but still no custom provider can be set. There must be more settings stored somewhere. Also, is there no complete uninstall script? That would be helpful for bugs like this.

/home/aimgr/.config/goose
/home/aimgr/.local/share/goose/
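Regarding the missing uninstall script, a rough reset sketch based on the two paths above. The backup step is my addition, and note that anything saved to the system keyring lives outside these directories and will survive this:

```shell
# Rough manual reset (no official uninstall script exists, per above).
# Paths are the two defaults mentioned in this thread; back up first.
conf="$HOME/.config/goose"
data="$HOME/.local/share/goose"
[ -d "$conf" ] && cp -r "$conf" "$conf.bak.$$" || true
rm -rf "$conf" "$data"
# Secrets stored in the desktop keyring (GNOME Keyring etc.) are NOT
# removed by this; clear those separately if needed.
```

That keyring leftover may be the "more settings stored somewhere" suspected above.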

(venv2) aimgr@ubuntu:~$ goose configure

This will update your existing config file
  if you prefer, you can edit it directly at /home/aimgr/.config/goose/config.yaml

┌   goose-configure 
│
◇  What would you like to configure?
│  Custom Providers 
│
◇  What would you like to do?
│  Add A Custom Provider 
│
◇  What type of API is this?
│  OpenAI Compatible 
│
◇  What should we call this provider?
│  chute
│
◇  Provider API URL:
│  https://llm.chutes.ai/v1/chat/completions
│
◇  API key:
│  ▪▪▪▪▪▪▪▪▪
│
◇  Available models (seperate with commas):
│  dfgdafg
│
◇  Does this provider support streaming responses?
│  Yes 
│

(venv2) aimgr@ubuntu:~$ goose configure

This will update your existing config file
  if you prefer, you can edit it directly at /home/aimgr/.config/goose/config.yaml

┌   goose-configure 
│
◇  What would you like to configure?
│  Custom Providers 
│
◇  What would you like to do?
│  Remove Custom Provider 
│
└  No custom providers added just yet.
user@ubuntu:~$ cat /home/user/.config/goose/config.yaml
OPENAI_BASE_PATH: api/v1/chat/completions
OPENAI_HOST: https://llm.chutes.ai
GOOSE_PROVIDER: custom_chutes
extensions:
  developer:
    available_tools: []
    bundled: true
    description: null
    display_name: Developer Tools
    enabled: true
    name: developer
    timeout: 30
    type: builtin
  memory:
    available_tools: []
    bundled: true
    description: null
    display_name: Memory
    enabled: true
    name: memory
    timeout: 30
    type: builtin
GOOSE_MODEL: Qwen/Qwen3-Coder-480B-A35B-Instruct-FP8
user@ubuntu:~$ cat /home/aimgr/.config/goose/config.yaml
VENICE_BASE_PATH: api/v1/chat/completions
GOOSE_MODE: smart_approve
GOOSE_PROVIDER: venice
GOOSE_MODEL: llama-3.2-3b [fsw]
extensions:
  developer:
    available_tools: []
    bundled: true
    description: null
    display_name: Developer Tools
    enabled: true
    name: developer
    timeout: 30
    type: builtin
  memory:
    available_tools: []
    bundled: true
    description: null
    display_name: Memory
    enabled: true
    name: memory
    timeout: 30
    type: builtin
VENICE_HOST: https://api.venice.ai
VENICE_MODELS_PATH: api/v1/models

auwsom avatar Sep 04 '25 18:09 auwsom

I just confirmed again that adding the config from the working config.yaml to the non-working one does not change the config file enough to make a custom provider work.

  error: Error Unknown provider: custom_chutes.
Please check your system keychain and run 'goose configure' again.
If your system is unable to use the keyring, please try setting secret key(s) via environment variables.
For more info, see: https://block.github.io/goose/docs/troubleshooting/#keychainkeyring-errors
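The last lines of that error suggest the env-var fallback. Going by the api_key_env naming convention in the custom provider JSON earlier in this thread, the variable for custom_chutes would presumably be CUSTOM_CHUTES_API_KEY; that name is an assumption on my part (check the generated custom_chutes.json), not something the error states:

```shell
# Assumed variable name; verify against "api_key_env" in
# ~/.config/goose/custom_providers/custom_chutes.json before relying on it.
export CUSTOM_CHUTES_API_KEY="dummy-key"
# then retry: goose configure
```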

Then, after adding OPEN_AI_KEY for chutes, I got the error below, and then my config was "pleasantly" mostly reset back to a non-working state.

thread 'main' panicked at crates/goose/src/token_counter.rs:93:82:
no entry found for key
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
VENICE_HOST: https://api.venice.ai
extensions:
  developer:
    available_tools: []
    bundled: true
    description: null
    display_name: Developer Tools
    enabled: true
    name: developer
    timeout: 30
    type: builtin
  memory:
    available_tools: []
    bundled: true
    description: null
    display_name: Memory
    enabled: true
    name: memory
    timeout: 30
    type: builtin
GOOSE_PROVIDER: venice
GOOSE_MODEL: llama-3.2-3b [fsw]
VENICE_MODELS_PATH: api/v1/models
VENICE_BASE_PATH: api/v1/chat/completions

auwsom avatar Sep 04 '25 18:09 auwsom

Side note: I am confused why, when I go to set OpenRouter back as the default provider, I get the following, which then exits without setting the provider, unless I select "No" when asked about saving the key in the keyring. That is not obvious and was only found by trial and error:

(venv2) aimgr@ubuntu:~$ goose configure

This will update your existing config file
  if you prefer, you can edit it directly at /home/aimgr/.config/goose/config.yaml

┌   goose-configure 
│
◇  What would you like to configure?
│  Configure Providers 
│
◇  Which model provider should we use?
│  OpenRouter 
│
●  OPENROUTER_API_KEY is set via environment variable
│  
◇  Would you like to save this value to your keyring?
│  Yes 
│
(venv2) aimgr@ubuntu:~$

auwsom avatar Sep 04 '25 19:09 auwsom

As an FYI:

  1. On Ubuntu I do not use the keyring, so I keep (easy) visibility of my settings. I use my bashrc file where possible.
  2. I am further testing the custom providers. I fixed the double entries in the UI, but there is still a bug that when there are 2 providers, the settings of the first custom provider are used by the second provider. So it just does not work. Possibly the same root cause exists in the CLI. Some of the code is in the goose-server (goosed), so it is shared, but I am not 100% sure yet; I am just learning how it all works, coding with goose, and building/debugging myself ...
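For point 1, a sketch of that bashrc approach. The variable names are examples pulled from earlier comments in this thread, and the values are placeholders:

```shell
# In ~/.bashrc (or a file you `source`): keep provider secrets as plain
# env vars instead of the keyring, for easy visibility.
export OPENROUTER_API_KEY="sk-or-placeholder"
export CUSTOM_XXX_API_KEY="sk-placeholder"
# When `goose configure` offers to save a value to the keyring, answering
# "No" keeps the env vars as the single source of truth (see the earlier
# comment about the keyring prompt exiting after "Yes").
```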

cbruyndoncx avatar Sep 05 '25 10:09 cbruyndoncx

Just an update that for some reason I was able to change to a custom provider. It could have been from updating to canary and then updating again to stable. I still had to use the routine of setting Custom Providers and then going back to Configure Providers and using the OpenAI selection. This was after setting GOOSE_PROVIDER: openai, so maybe that's the special sauce. This time I was trying the Phala provider and set OPENAI_API_KEY= to my Phala key (and of course source ~/.env). HTH

GOOSE_PROVIDER: openai
OPENAI_BASE_PATH: v1/chat/completions
OPENAI_HOST: https://api.redpill.ai
GOOSE_MODEL: phala/qwen3-coder

auwsom avatar Sep 09 '25 19:09 auwsom

Tested CLI v1.11.1 on Ubuntu (CLI only, no GUI install). The custom provider bit works; tested with OpenAI compatible (configuring 'LikeOpenrouter', since for testing OpenAI compatibility that is a reliable party).

But it still shows the OPENAI env variables when running through these menus. I generally ignore/don't save, but I am not sure what happens if you save to the keyring and possibly overwrite good values.

So some tweaking is still needed in that section @DOsinga

Image

cbruyndoncx avatar Oct 21 '25 20:10 cbruyndoncx