gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs and 50+ AI Guardrails with 1 fast & friendly API.
### What Would You Like to See with the Gateway?
Support for the Straico API.
- API documentation: https://documenter.getpostman.com/view/5900072/2s9YyzddrR
- Official website: https://straico.com/
- See also: https://github.com/Gunther-Schulz/openai-straico-api-bridge
### What Happened?
After instantiating a Portkey client and invoking the chat.completions.create function, we found some possible problems when configuring the fallback process.
1. No matter if...
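For context, here is a minimal sketch of the kind of setup this issue describes, assuming the Portkey Node SDK (`portkey-ai`) and its config-object fallback format; the providers, keys, and model below are placeholders, not values from the report:

```typescript
import Portkey from "portkey-ai";

// Fallback config: try the first target, fall back to the next on failure.
// Providers, keys, and the model are placeholders for illustration.
const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY ?? "",
  config: {
    strategy: { mode: "fallback" },
    targets: [
      { provider: "openai", api_key: process.env.OPENAI_API_KEY },
      { provider: "anthropic", api_key: process.env.ANTHROPIC_API_KEY },
    ],
  },
});

async function main() {
  // The call whose fallback behaviour the issue is about.
  const completion = await portkey.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Hello" }],
  });
  console.log(completion.choices[0]?.message?.content);
}

main().catch(console.error);
```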
### What Would You Like to See with the Gateway?
I would like to be able to start the server using the Node.js SDK. This would require exposing the script that's...
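To illustrate the request, a purely hypothetical sketch of what such a programmatic entry point could look like; `startGateway` and its return value do not exist in `@portkey-ai/gateway` today and are placeholders for whatever the maintainers choose to expose:

```typescript
// Hypothetical API, for illustration only: today the gateway is started via
// `npx @portkey-ai/gateway`; nothing like `startGateway` is exported yet.
import { startGateway } from "@portkey-ai/gateway";

async function main() {
  // Start the gateway in-process on a chosen port.
  const server = await startGateway({ port: 8787 });

  // ... send requests to http://localhost:8787/v1 ...

  // Shut it down when the host application exits.
  await server.close();
}

main().catch(console.error);
```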
### What Would You Like to See with the Gateway?
Support for Hyperbolic: https://docs.hyperbolic.xyz/docs/getting-started
### What Would You Like to See with the Gateway?
Support for Sarvam AI: https://docs.sarvam.ai/api-reference-docs/combined_usage_guide
I would like to add this; please assign it to me.
### What Would You Like to See with the Gateway?
Support for Infermatic: https://infermatic.ai/for-developers/
https://github.com/Portkey-AI/gateway/blob/b93034ca8954b8a9b9dcc9ef52880c3d00b26ca5/.github/README.cn.md?plain=1#L175
https://github.com/Portkey-AI/gateway/blob/b93034ca8954b8a9b9dcc9ef52880c3d00b26ca5/.github/README.jp.md?plain=1#L109
The project linked at these README locations is outdated; a maintained fork is available at https://github.com/Lambdua/openai4j. It's recommended to update the markdown so that Java users can integrate with the gateway more easily.
### What Happened?
Provider: `azure-ai`
Example model: `mistral-large`
According to the new Azure AI Studio, the AI inference model URLs have changed to `https://resource-name.openai.azure.com/`, but the code still...
### What Happened?
To get the usage object, the user has to pass `streaming_options: { include_usage: true }`. This could be the default behaviour.
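For reference, a minimal sketch of the current opt-in behaviour, assuming an OpenAI-compatible client pointed at a locally running gateway; the base URL, header, and model are placeholders, and the standard OpenAI field name is `stream_options` (the issue refers to it as `streaming_options`):

```typescript
import OpenAI from "openai";

// OpenAI-compatible client pointed at a locally running gateway.
// Base URL, provider header, and model are assumptions for illustration.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY ?? "",
  baseURL: "http://localhost:8787/v1",
  defaultHeaders: { "x-portkey-provider": "openai" },
});

async function main() {
  const stream = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Hello" }],
    stream: true,
    // Today the caller must opt in explicitly; the issue suggests
    // making usage reporting the default for streamed responses.
    stream_options: { include_usage: true },
  });

  for await (const chunk of stream) {
    // With include_usage set, the final chunk carries the usage object.
    if (chunk.usage) console.log(chunk.usage);
  }
}

main().catch(console.error);
```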
Support models hosted on HF's dedicated Inference Endpoints, such as: https://huggingface.co/black-forest-labs/FLUX.1-dev