
Getting started documentation does insufficiently document provider parameters

Open cruftex opened this issue 2 months ago • 9 comments

Describe the bug

Since I cannot get the CLI configuration to work (see: #5137), I tried to set the variables in the configuration file myself.

To Reproduce

According to the "Getting started" documentation (https://block.github.io/goose/docs/getting-started/providers), the only required parameter is OPENAI_KEY. When executing goose configure, it says:

/home/jeans/.config/goose/config.yaml

There is also a sample configuration:

OPENAI_HOST=https://your-vllm-endpoint.internal
OPENAI_API_KEY=your-internal-api-key

Is that yaml or should I set environment variables?

I put in the config.yaml:

OPENAI_HOST: "http://localhost:1234"
OPENAI_API_KEY: "ignore"

Goose says there is no provider.

goose session

thread 'main' panicked at /home/runner/work/goose/goose/crates/goose-cli/src/session/builder.rs:190:10:
No provider configured. Run 'goose configure' first
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Expected behaviour

  • A minimal working set of parameters is shown in Getting Started
  • No confusion about where to put parameters or which syntax to use
  • The documentation section "Configure LLM Provider" in Getting Started should document everything needed

Please provide the following information:

  • OS & Arch: Ubuntu 24.04
  • Interface: cli
  • Version: 1.9.3
  • Extensions enabled: -
  • Provider & Model: local

cruftex avatar Oct 12 '25 10:10 cruftex

you need to add a line like:

GOOSE_PROVIDER: openai
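
For reference, a minimal config.yaml combining this with the values from the issue might look like the sketch below. GOOSE_MODEL and the model name are assumptions, not verified against the docs; the endpoint values are placeholders from the issue:

```yaml
# ~/.config/goose/config.yaml -- minimal sketch, values are placeholders
GOOSE_PROVIDER: openai                # selects the provider backend
GOOSE_MODEL: gpt-4o-mini              # assumed key; adjust for your endpoint
OPENAI_HOST: "http://localhost:1234"  # OpenAI-compatible server
OPENAI_API_KEY: "ignore"              # dummy value for local servers
```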

DOsinga avatar Oct 12 '25 16:10 DOsinga

@blackgirlbytes, can I work on it?

Lymah123 avatar Oct 14 '25 15:10 Lymah123

yes you can! Thanks for asking @Lymah123

blackgirlbytes avatar Oct 14 '25 20:10 blackgirlbytes

Dude, this has been an issue; the quick workaround for me was:

in your working terminal:

export OPENAI_KEY=

Now, to persist it, just add this wherever goose is located or your local env is set. If this is fixed, good, but honestly no need for docs here @blackgirlbytes, it's clearly an issue to resolve rather than provide a workaround for: why isn't the env getting set via goose configure directly?
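
As a sketch, persisting this workaround in a shell profile could look like the following. The GOOSE_PROVIDER variable name and the values are assumptions based on the comments in this thread, not verified documentation:

```shell
# Hypothetical ~/.bashrc / ~/.zshrc additions -- values are placeholders
export GOOSE_PROVIDER=openai                 # provider, per the comment above
export OPENAI_HOST=http://localhost:1234     # OpenAI-compatible endpoint
export OPENAI_API_KEY=your-internal-api-key  # dummy value for local servers
echo "provider: $GOOSE_PROVIDER"             # prints: provider: openai
```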

ARYPROGRAMMER avatar Oct 22 '25 13:10 ARYPROGRAMMER

Thank you so much for the clarity and for checking, @ARYPROGRAMMER. I agree on this. Thank you!

taniandjerry avatar Oct 22 '25 15:10 taniandjerry

Reopening this for my team to address

blackgirlbytes avatar Nov 06 '25 14:11 blackgirlbytes

Thanks @ARYPROGRAMMER ! If anything, we can put this in the troubleshooting guide.

blackgirlbytes avatar Nov 06 '25 14:11 blackgirlbytes

I was wondering the same. It would definitely make sense to be able to define OPENAI_API_KEY in config.yaml like this:

OPENAI_API_KEY: "ignore"

The OpenAI-compatible model configuration is very unintuitive altogether, unfortunately. I hope that whole flow gets redesigned.

Anyway, thanks for a great agent!

erkkimon avatar Nov 06 '25 21:11 erkkimon

Thanks @ARYPROGRAMMER ! If anything, we can put this in the troubleshooting guide.

Good strategy for the short term, but you will need a better architecture to solve this in the way @erkkimon is describing.

ARYPROGRAMMER avatar Nov 06 '25 22:11 ARYPROGRAMMER