
🔒 Enterprise-grade API gateway that helps you monitor and impose cost or rate limits per API key. Get fine-grained access control and monitoring per user, application, or environment. Supports OpenAI...

Results: 16 BricksLLM issues

Hi @spikelu2016, this PR changes all the correlation-id logging code, for both admin and proxy. Now the sub-logger with the correlation id is stored inside the http request...

Please consider adding OpenAPI documentation for the endpoints and making it available on e.g. the /swagger path. I'm happy to see frequent updates to this awesome project, but it can...

How can I delete a custom provider or provider settings?

Currently, the BricksLLM proxy only has basic key authentication. OAuth could help improve the security of the proxy endpoints.

enhancement

Currently, internal errors in BricksLLM return error responses using OpenAI's format. This might cause issues for Anthropic SDKs.

bug
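To illustrate the mismatch the issue describes, here is a hedged sketch (not BricksLLM's actual code) of the shape difference between the two formats and a translation helper; the field layouts follow OpenAI's and Anthropic's documented error envelopes:

```python
# Sketch only: OpenAI wraps errors under a top-level "error" object with
# message/type/param/code, while Anthropic expects a {"type": "error", ...}
# envelope. An Anthropic SDK parsing an OpenAI-shaped body may fail.

def openai_to_anthropic_error(openai_body: dict) -> dict:
    """Translate an OpenAI-format error body into Anthropic's format."""
    err = openai_body.get("error", {})
    return {
        "type": "error",
        "error": {
            "type": err.get("type", "api_error"),
            "message": err.get("message", ""),
        },
    }

# Example OpenAI-style error body (illustrative values):
openai_err = {
    "error": {
        "message": "Rate limit exceeded",
        "type": "rate_limit_error",
        "param": None,
        "code": None,
    }
}
print(openai_to_anthropic_error(openai_err))
```

A gateway could apply such a translation whenever the inbound request targets an Anthropic-style endpoint.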

Gemini has been released. Add support for Gemini endpoints.

enhancement
help wanted

Is it possible to define rate limits at the granularity of the OpenAI API, i.e., different RPM/TPM for each model? Context is that we want to give 100 students...

enhancement

Is the failover feature only supported for OpenAI and Azure? Could vLLM providers be supported as well? Is there a load-balancing function between two identical LLMs? So, how should I...
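For the load-balancing part of the question, a minimal client-side sketch (placeholder URLs, not BricksLLM configuration) of round-robin selection between two identical upstreams, e.g. two vLLM servers:

```python
import itertools

# Hypothetical upstream addresses for two identical vLLM deployments.
UPSTREAMS = ["http://vllm-a:8000/v1", "http://vllm-b:8000/v1"]
_cycle = itertools.cycle(UPSTREAMS)

def next_upstream() -> str:
    """Round-robin selection; a failover wrapper could retry the other node
    when a request to the chosen upstream fails."""
    return next(_cycle)
```

Consecutive calls alternate between the two addresses, spreading load evenly.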

Currently, when a path doesn't match, we just return `[BricksLLM] route not supported`, which makes it slightly tricky to debug API calls made via an SDK, because the SDK hide...
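One possible shape for a more SDK-friendly response, offered as a suggestion rather than BricksLLM's actual behavior: an OpenAI-style error body that echoes the offending method and path so SDK error messages carry useful context:

```python
# Hedged sketch: build a structured "route not supported" body instead of a
# bare plain-text string, so SDKs that parse OpenAI-style errors can surface
# the method and path to the caller.

def route_not_supported(method: str, path: str) -> dict:
    return {
        "error": {
            "message": f"[BricksLLM] route not supported: {method} {path}",
            "type": "invalid_request_error",
            "param": None,
            "code": "route_not_supported",
        }
    }
```

Served with a 404 status, this keeps the existing message prefix while making the failing route visible through the SDK's exception.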

Allow selection of specific 'azure' provider for custom routes based on provider name