
[Feature request]: Support for OpenAI-like integration methods.

SimFG opened this issue 2 months ago • 4 comments

What do you need?

Perhaps providing some OpenAI-compatible integration methods could offer more choices. It could perhaps also support a Claude Code-style binary invocation, such as claude -p.

SimFG avatar Dec 22 '25 05:12 SimFG

Hi @SimFG. Thanks for opening the issue.

I’m not quite sure I understand what you’re asking for yet. When you say:

  • “OpenAI-like integration methods”
  • “Claude code-like bin files (such as claude -p)”

Could you please clarify what specific behavior you’re referring to?

Concretely, it would really help if you could describe one or more of the following:

  • A command-line flow you have in mind (exact commands, flags, expected output)
  • The configuration mechanism you’re referring to (env vars, config files, bin wrappers, etc.)
  • Which OpenAI or Claude tools you’re comparing against, and what Fabric currently lacks in that comparison
  • A short before/after example of how you’d want to use Fabric differently

Right now the request is a bit too high-level for me to map to an actionable feature, so explicit examples would help a lot.

Thanks!

ksylvan avatar Dec 22 '25 07:12 ksylvan

Thank you very much for your reply.

Currently, the Fabric setup has three steps, and the third is configuring the large model. Fabric offers many options, but an OpenAI-compatible approach would also be a good choice. The large models I generally use aren't among those options, and most large-model providers now offer OpenAI-compatible APIs (usually configured with a base URL, a model name, and an API key), for example: https://platform.minimaxi.com/docs/api-reference/text-openai-api

Regarding the second point: besides integrating a large model through its API, there is also a terminal approach, for example: tail -f app.log | claude -p "Slack me if you see any anomalies appear in this log stream"

https://code.claude.com/docs/en/overview

SimFG avatar Dec 22 '25 08:12 SimFG

@SimFG

Thanks for the clarification. That helps a lot 👍

I think you’re actually describing two separate requests, so I’ll address them independently.

  1. OpenAI-compatible provider integrations (e.g. MiniMax)

What you’re asking for here is support for OpenAI-compatible APIs where the integration consists of:

  • A base URL
  • A model name
  • An API key

Many newer providers (like MiniMax) expose this surface, even if they aren’t OpenAI themselves.

That’s a provider support request, not a generic integration-method change, so please open a separate issue specifically requesting MiniMax provider support. That makes it actionable and keeps the scope clear.
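To make "OpenAI-compatible" concrete: the whole integration surface reduces to those three values. The endpoint, key, and model below are placeholders, not real credentials, and the echo keeps the sketch runnable without a key (drop it to actually send the request):

```shell
# Placeholder endpoint, key, and model (assumptions, not real values);
# an OpenAI-compatible provider varies only in these three settings.
OPENAI_BASE_URL="https://api.minimaxi.com/v1"
OPENAI_API_KEY="sk-example"
MODEL="MiniMax-Text-01"

# The request shape is identical across OpenAI-compatible providers.
echo curl -s "$OPENAI_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"$MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}"
```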

  2. Claude Code–style Unix composability (claude -p)

For the second point: Fabric already supports this model by design.

Fabric’s philosophy is explicitly Unix-like: small, composable tools that work via stdin/stdout and shell pipelines. You don’t need a special “Claude-style” binary to do this; Fabric is that binary.

For example, the flow you showed in the screenshot maps directly to Fabric as:

tail -1000 /var/log/system.log | fabric -m $MODEL_CLAUDE \
  "Do you see anything weird in this macOS system log?"

Example output:

# System Log Analysis

After reviewing this macOS system log, here are my observations:

## Normal Activity
- ASL Sender Statistics entries: routine syslogd behavior
- SSH sessions by user "kayvan"
- awdd diagnostics entry, which is normal

Fabric intentionally does not wrap this in provider-specific subcommands (claude -p, openai chat, etc.).

Instead, it provides a single, consistent CLI that works the same way regardless of provider.

Summary

  • OpenAI-compatible APIs → open a provider-specific issue (e.g., Minimax)
  • Unix-style CLI composition → already supported; it aligns with the core philosophy of Fabric.

If you have a specific workflow that you feel Fabric can’t express today (streaming behavior, callbacks, tool invocation, etc.), feel free to describe it concretely, and we can look at that. But for stdin/stdout composability, Fabric already does exactly what Claude Code is demonstrating.

ksylvan avatar Dec 22 '25 08:12 ksylvan

This is actually quite simple; it just needs to support reading the OPENAI_BASE_URL environment variable. If it's not provided, the default one will be used.

SimFG avatar Dec 22 '25 09:12 SimFG

> This is actually quite simple; it just needs to support reading the OPENAI_BASE_URL environment variable. If it's not provided, the default one will be used.

You can already do that. Just select "OpenAI" as the provider and then give the MiniMax base URL (to override the default) during the fabric --setup flow.

However, supporting MiniMax is likely a few-line code change. I'll go ahead and add it to the list of supported OpenAI-compatible providers.
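For reference, the fallback behavior described above (use OPENAI_BASE_URL if set, otherwise a default) is the standard shell default-expansion pattern. This is an illustrative sketch only; Fabric's actual configuration lives in its --setup flow:

```shell
# Use OPENAI_BASE_URL from the environment if set, otherwise fall
# back to the stock OpenAI endpoint.
BASE_URL="${OPENAI_BASE_URL:-https://api.openai.com/v1}"
echo "$BASE_URL"
```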

ksylvan avatar Dec 22 '25 14:12 ksylvan

@SimFG


Coming up in the next version.

ksylvan avatar Dec 22 '25 22:12 ksylvan

@SimFG Please try version 1.4.357 and let me know whether connecting to MiniMax works for you.

ksylvan avatar Dec 22 '25 23:12 ksylvan