
Z.AI Provider

Open · Neil-urk12 opened this issue 1 month ago · 1 comment

Please explain the motivation behind the feature request. Does this feature solve a particular problem you have been experiencing? What opportunities or use cases would be unlocked with this feature?

Currently, Goose only supports a limited set of LLM providers, which makes it harder to experiment with or standardize on alternative ecosystems like Z.AI. Z.AI’s GLM models (such as GLM-4.5, GLM-4.5-Air, and GLM-4.6) are designed specifically for reasoning, coding, and agent-style workloads, and they expose an OpenAI-compatible API, making them a natural fit for Goose’s agentic workflows. Adding Z.AI as a first-class provider would let users who already rely on GLM models bring their existing API keys, quotas, and organizational setup directly into Goose instead of managing a separate provider just for this purpose.

Describe the solution you'd like

Add Z.AI as a built‑in API provider in Goose’s configuration so that it can be selected like the existing providers. Concretely, this would mean:

Supporting configuration of a Z.AI base URL and API key (for example, using the OpenAI-compatible /chat/completions endpoint exposed by Z.AI).

Exposing GLM models (e.g., GLM-4.5, GLM-4.5-Air, GLM-4.6) in Goose’s model list, either by default or via a simple config entry.
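If Goose's existing OpenAI provider honors a configurable host (as it does for other OpenAI-compatible services), Z.AI could likely be wired up today with environment variables. The variable names below follow Goose's documented OpenAI provider settings but should be verified against the current docs; the key is a placeholder:

```shell
# Hedged sketch: point Goose's OpenAI provider at Z.AI's
# OpenAI-compatible endpoint. Variable names assume Goose's
# standard OpenAI provider settings; verify against the docs.
export GOOSE_PROVIDER="openai"
export OPENAI_HOST="https://api.z.ai/api/paas/v4"
export OPENAI_API_KEY="your-zai-api-key"   # placeholder, not a real key
export GOOSE_MODEL="glm-4.6"
```

A built-in provider would mainly save users from this manual setup and surface the GLM model names in the model picker.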


  • [x] I have verified this does not duplicate an existing feature request

Neil-urk12 · Dec 05 '25 02:12

As far as I know, Z.AI provides multiple API endpoints depending on product line and use case:

  • Anthropic-compatible (Claude): https://api.z.ai/api/anthropic
  • Standard OpenAI-compatible API: https://api.z.ai/api/paas/v4
  • Coding-focused OpenAI-compatible API (used by their coding/agent products): https://api.z.ai/api/coding/paas/v4

Depending on the user’s subscription plan, the available endpoints differ. This means Goose may need a way to customize or select the desired API endpoint during configuration, rather than assuming a single universal OpenAI-compatible URL.

In addition, Z.AI’s ecosystem includes two different platform variants:

  1. International platform: z.ai
  2. China Mainland platform: bigmodel.cn

Both platforms expose the same API path structure but use different domains and API keys tied to separate user accounts. As a result, a simple "one-click" setup is probably not feasible unless Goose provides an API-endpoint selector or a more flexible provider configuration block.
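The endpoint matrix above (platform × API family) reduces to a small lookup, which is roughly what a provider-configuration selector would compute. The function and option names are illustrative, not Goose's actual API, and the exact Mainland API host is an assumption since the thread only names the bigmodel.cn domain:

```python
# Sketch of endpoint selection for a hypothetical Z.AI provider config.
# URLs for z.ai come from the endpoints listed in this thread; the
# Mainland host subdomain is an assumption and should be verified.

PLATFORM_HOSTS = {
    "international": "https://api.z.ai",
    "china": "https://open.bigmodel.cn",  # assumption: exact subdomain
}

API_FAMILY_PATHS = {
    "anthropic": "/api/anthropic",
    "openai": "/api/paas/v4",
    "coding": "/api/coding/paas/v4",
}

def zai_base_url(platform: str, api_family: str) -> str:
    """Combine a platform domain and an API family into a base URL."""
    try:
        host = PLATFORM_HOSTS[platform]
        path = API_FAMILY_PATHS[api_family]
    except KeyError as exc:
        raise ValueError(f"unknown option: {exc}") from exc
    return host + path

print(zai_base_url("international", "openai"))
# https://api.z.ai/api/paas/v4
```

Keeping the two choices independent means any new API family Z.AI ships becomes a one-line addition rather than a new provider.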

Why This Matters

This flexibility is important because:

  • Users may need to access Claude, OpenAI-compatible, or coding-optimized endpoints depending on their workflow.
  • GLM model families (GLM-4.5, GLM-4.5-Air, GLM-4.6) may be exposed differently across product lines.
  • Organizations using the Mainland platform (bigmodel.cn) cannot directly use z.ai keys, and vice versa.

Allowing a configurable base URL + model list would solve these constraints without adding unnecessary complexity to Goose’s internal implementation.

Supporting Z.AI as a built-in provider is valuable, but the provider configuration likely needs:

  • A selectable base URL (z.ai vs bigmodel.cn)
  • A selectable API family (Claude / Standard OpenAI API / Coding OpenAI API)
  • A user-provided API key
  • Optional model listing for GLM models

This would accommodate all Z.AI deployment variants while keeping Goose flexible for any future API expansions.
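The four requirements above could be captured in a provider configuration block along these lines. This is a hypothetical schema; the key names are illustrative and do not correspond to an existing Goose config format:

```yaml
# Hypothetical Z.AI provider entry; key names are illustrative only.
provider:
  name: zai
  platform: international          # or: china (bigmodel.cn)
  api_family: openai               # or: anthropic, coding
  api_key_env: ZAI_API_KEY         # key supplied by the user
  models:                          # optional GLM model listing
    - glm-4.5
    - glm-4.5-air
    - glm-4.6
```

The base URL would be derived from `platform` and `api_family`, so users never have to assemble endpoint paths by hand.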

SuSuSoo · Dec 05 '25 12:12