Supporting Google and AWS LLMs available through APIs
Having an option to use, e.g., GCP's Bison model instead of GPT would be really helpful. Some organizations are very attached to their cloud vendors...
Similar to the Anthropic models (#118), GCP does not currently seem to offer support for model steering, i.e. masking of the model distribution during text generation: https://cloud.google.com/vertex-ai/docs/generative-ai/model-reference/text
This means we can offer support, but advanced constraining/steering will not be available due to API restrictions on their end.
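For reference, here is a minimal sketch of what plain text generation against Vertex AI looks like, assuming the `google-cloud-aiplatform` SDK and a `text-bison` model (the project id is hypothetical and parameter names should be checked against the current docs). The point is that the API only accepts sampling parameters; there is no hook for per-step logit masking.

```python
# Sketch: plain text generation against Vertex AI (PaLM text-bison).
# Assumes the google-cloud-aiplatform SDK is installed and the environment is
# authenticated (e.g. `gcloud auth application-default login`).
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "Say 'this is a test'",
    temperature=0.2,
    max_output_tokens=64,
    top_k=40,
    top_p=0.95,
)

# The API returns generated text only -- there is no parameter for logit bias
# or per-step token masking, which is what LMQL's constraint decoding needs.
print(response.text)
```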
Do any of the pretrained models in AWS Bedrock support it (AI21, Titan)? How about Luminous from Aleph Alpha?
There are now models in Bedrock that return logits in different forms. Here are a couple of examples:
- Cohere's command model: https://docs.cohere.com/reference/generate
- AI21's jurassic model: https://docs.ai21.com/reference/j2-complete-ref
Sadly, Claude remains very restrictive.
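As a rough illustration of what those endpoints expose, requesting per-token likelihoods from Cohere Command via Bedrock's `InvokeModel` might look like the sketch below. This is an assumption-laden example: the model id and body fields follow the Bedrock inference-parameter docs for Cohere and may need adjustment.

```python
# Sketch: per-token likelihoods from Cohere Command through Bedrock.
# Assumes boto3 with Bedrock access in the account/region; field names follow
# the Bedrock docs for Cohere and should be double-checked.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "prompt": "Say 'this is a test'",
    "max_tokens": 32,
    "temperature": 0.2,
    # Ask for token-level likelihoods of the generated text.
    "return_likelihoods": "GENERATION",
}

response = bedrock.invoke_model(
    modelId="cohere.command-text-v14",
    body=json.dumps(body),
)
result = json.loads(response["body"].read())

# Each generation carries token_likelihoods, i.e. post-hoc log-probabilities
# of the sampled tokens. Note this is still only scoring of the generated
# text, not the full masked distribution that advanced constraints require.
for gen in result["generations"]:
    print(gen["text"])
    print(gen.get("token_likelihoods"))
```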
I'd like to work on integrating it if it's an option :)
I would also vote for this; being able to run against a Bedrock back-end would be very helpful, even for a limited subset of models.
Upvoting this! Very interested in integration for Cohere and Bedrock in general.
I'm also really hoping for Cohere Command-R integration.