goose
an open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
Closes #1198 This could technically be accomplished already by overriding `OPENAI_URL`. The intent is just to make this easier to set up when configuring a provider via the CLI....
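A minimal sketch of the existing workaround mentioned above, assuming the OpenAI provider reads an `OPENAI_URL` environment variable; the host and key below are placeholders, not real endpoints:

```shell
# Point the OpenAI provider at a custom, OpenAI-compatible endpoint.
# OPENAI_URL is the override named in the issue; the URL is illustrative.
export OPENAI_URL="https://llm-proxy.internal.example.com/v1"
export OPENAI_API_KEY="sk-placeholder"  # key for the proxied service
```

The proposed change would let the CLI prompt for this URL during provider configuration instead of requiring a manual environment override.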
**Describe the bug** When a tool returns an error and the request times out, I get: ``` 2025-02-13T18:09:02.673465Z ERROR goose::agents::truncate: Error: Request failed: Request failed with status: 400 Bad Request....
Given the decision to move all non-secret configuration data into `config.yaml`, it might make sense to let UI users view it (and perhaps edit it).
**Describe the bug** Windows builds take far too long in CI. Someone more experienced with it than me should take a look. **Expected behavior** Build times on par with macOS.
Is it possible? I am not familiar with their APIs, etc. Due to corporate policies, it is the only LLM service we can use on our codebases.
The Goose provider configurations make it difficult to use an OpenAI-_"like"_ provider that runs behind a proxy somewhere or exposes an API shaped just like OpenAI's....
The community has a PR to enable the Vertex AI endpoint: https://github.com/block/goose/pull/1138 To use Vertex AI, we require `VERTEXAI_PROJECT_ID` and `VERTEXAI_REGION`, and right now only "claude-3-5-sonnet-v2@20241022" and "claude-3-5-sonnet@20240620" are supported....
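A sketch of the configuration that PR requires, per the variables named above; the project ID and region values are placeholders for illustration:

```shell
# Required by the proposed Vertex AI provider (block/goose#1138).
# Both values below are illustrative; use your own GCP project and region.
export VERTEXAI_PROJECT_ID="my-gcp-project"
export VERTEXAI_REGION="us-central1"
```

Model choice would still be limited to the two Claude versions listed above until broader model support lands.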
Right now users blindly approach the context window limit and then run into it, breaking their current Goose session. It would be helpful to expose the remaining context window length...
**Describe the bug** Goose decided to remove the project directory to "Start Fresh" after running into minor issues committing/pushing to a GitHub repo. **To Reproduce** Steps to reproduce the behavior: 1....
The CLI version already offers Amazon Bedrock as a provider option, but it would be fantastic if that feature were also implemented in the Desktop App.