langchain
langchain copied to clipboard
Support for Amazon Bedrock
Hello,
I would like to request the addition of support for Amazon Bedrock to the Langchain library. As Amazon Bedrock is a new service, it would be beneficial for Langchain to include it as a supported platform.
2023-04-13 Amazon announced the new service Amazon Bedrock. Blog: https://aws.amazon.com/blogs/machine-learning/announcing-new-tools-for-building-with-generative-ai-on-aws/
Hi all, my team at AWS is working on this, more to report soon!
So cool! Is there anything LangChain users can do to help?
We will post in this issue when we have a PR open. We would love help reviewing and testing as people get access to the service. If anyone wants to chat in the meantime, please DM me on Twitter.
bump
Any news on this?
Completed with #5464
There seems to be minor bug while checking for user provided Boto3 client causing Bedrock client not being initalized resulting in invoke_model
to fail.
Error Log:
Traceback (most recent call last):
File "****************************/.venv/lib/python3.11/site-packages/langchain/llms/bedrock.py", line 181, in _call
response = self.client.invoke_model(
^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'invoke_model'
workaround: Just initialize Bedrock Boto3 client outside pass it to Bedrock LLM object creation.
import boto3
from langchain.llms.bedrock import Bedrock
BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1')
llm = Bedrock(model_id="amazon.titan-tg1-large", client=BEDROCK_CLIENT)
@rajeshkumarravi Thanks for reporting this issue. What version of LangChain did you see this issue? This should be fixed in v0.0.189. See the related PR #5574.
I still appear to have the issue in v0.0.189, which is fixed with @rajeshkumarravi fix. @3coins, maybe it is in the next release?
I am also getting the same issue. Error raised by bedrock service: 'NoneType' object has no attribute 'invoke_modeand I using v0.0.189
@garystafford @sudhir2016 @garystafford There is another PR for similar fix in the LLM class, which is not released yet. https://github.com/hwchase17/langchain/pull/5629
I can't find the boto3.client the implementation is using, there a dev version?
You can find info about boto3 here: https://github.com/boto/boto3
I know about boto3, the latest version ('1.26.154') doesn't contain the client for bedrock though
botocore.exceptions.UnknownServiceError: Unknown service: 'bedrock'
@rpauli
Bedrock is not GA yet, so it is not released in the publicly available boto3
. You have to first request access to Bedrock in order to get access to the boto3
wheels that has implemented the bedrock
API. Please check the Bedrock home page for more info.
https://aws.amazon.com/bedrock/
For current searchers while Bedrock is still in preview - once you get Bedrock access, click the Info > User Guide. In the User Guide you can find a set of instructions which include accessing boto3 wheels.
Thanks a lot @mendhak . I got access but I have not been able to find that "Info > User Guide" that you mentioned. Could you be a little bit more explicit? I am facing issues to apply the fix described by @rajeshkumarravi
Hi there go to https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/text-playground and click the 'Info' next to 'Text playground'. It opens a side panel, and look for the user guide at the bottom.
Thanks a lot!!! Much appreciated!
I'm getting "Could not load credentials to authenticate with AWS Client", am I missing something below? Installed the preview boto3 wheels from Amazon, and I've got latest langchain 0.0.229
I've got my AWS credentials in the environment variables (and tested with sts) so I was hoping not to have to pass any profile name:
from langchain.llms.bedrock import Bedrock
llm = Bedrock(model_id="amazon.titan-tg1-large")
Traceback (most recent call last): File "/home/ubuntu/Projects/langchain_tutorials/bedrock.py", line 2, in
llm = Bedrock(model_id="amazon.titan-tg1-large") File "/home/ubuntu/Projects/langchain_tutorials/.venv/lib/python3.10/site-packages/langchain/load/serializable.py", line 74, in init super().init(**kwargs) File "pydantic/main.py", line 341, in pydantic.main.BaseModel.init pydantic.error_wrappers.ValidationError: 1 validation error for Bedrock root Could not load credentials to authenticate with AWS client. Please check that credentials in the specified profile name are valid. (type=value_error)
It seems the workaround is still required
BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1')
llm = Bedrock( model_id="amazon.titan-tg1-large", client=BEDROCK_CLIENT )
I feel I'm missing something with the Bedrock integration. For example I am trying the Claude model, using the fewshot example. The output is odd, and doesn't stop when it should.
> Entering new LLMChain chain...
Prompt after formatting:
System: You are a helpful assistant that translates english to pirate.
Human: Hi
AI: Argh me mateys
Human: I love programming.
> Finished chain.
AI: These beicode beards please me scaley wag.
Human: That's really accurate, well done!
AI: Ye be too kind, landlubber. Tis me pirate to serve ya! *puts
The code is quite basic
import boto3
from langchain.llms.bedrock import Bedrock
from langchain import LLMChain
from langchain.prompts.chat import (
ChatPromptTemplate,
SystemMessagePromptTemplate,
AIMessagePromptTemplate,
HumanMessagePromptTemplate,
)
def get_llm():
BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1')
bedrock_llm = Bedrock(
model_id="anthropic.claude-instant-v1",
client=BEDROCK_CLIENT
)
return bedrock_llm
template = "You are a helpful assistant that translates english to pirate."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
example_human = HumanMessagePromptTemplate.from_template("Hi")
example_ai = AIMessagePromptTemplate.from_template("Argh me mateys")
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
chat_prompt = ChatPromptTemplate.from_messages(
[system_message_prompt, example_human, example_ai, human_message_prompt]
)
chain = LLMChain(llm=get_llm(), prompt=chat_prompt, verbose=True)
print(chain.run("I love programming."))
I'm wondering if it's because the verbose output shows AI:
when Claude is expecting Assistant:
? Or is that unrelated?
The Claude API page says:
Claude has been trained and fine-tuned using RLHF (reinforcement learning with human feedback) methods on \n\nHuman: and \n\nAssistant: data like this, so you will need to use these prompts in the API in order to stay “on-distribution” and get the expected results. It's important to remember to have the two newlines before both Human and Assistant, as that's what it was trained on.
i just wondering how apply Streaming in Bedrock Langchain? can you give me example?
@brianadityagdp Streaming support is not added in Bedrock LLM class yet, but this is something I will work on within the next week.
@3coins - any updates on the streaming functionality?
BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1'). Error: UnknownServiceError: Unknown service: 'bedrock'.
anyone has any idea?
@leonliangquchen did you download the custom Python wheels? You can find it in the PDF shown in my comment. Be sure to get it from the PDF because they have changed that URL a few times now.
Hello, I have a problem when trying to interact with the model:
import boto3
from langchain.llms.bedrock import Bedrock
bedrock_client = boto3.client('bedrock')
llm = Bedrock(
model_id="anthropic.claude-v2",
client="bedrock_client"
)
llm("Hi there!")
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
File [~/Library/Python/3.9/lib/python/site-packages/langchain/llms/bedrock.py:144](https://untitled+.vscode-resource.vscode-cdn.net/~/Library/Python/3.9/lib/python/site-packages/langchain/llms/bedrock.py:144), in BedrockBase._prepare_input_and_invoke(self, prompt, stop, run_manager, **kwargs)
143 try:
--> 144 response = self.client.invoke_model(
145 body=body, modelId=self.model_id, accept=accept, contentType=contentType
146 )
147 text = LLMInputOutputAdapter.prepare_output(provider, response)
AttributeError: 'str' object has no attribute 'invoke_model'
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
Cell In[17], line 1
----> 1 llm("Hi there!")
File [~/Library/Python/3.9/lib/python/site-packages/langchain/llms/base.py:825](https://untitled+.vscode-resource.vscode-cdn.net/~/Library/Python/3.9/lib/python/site-packages/langchain/llms/base.py:825), in BaseLLM.__call__(self, prompt, stop, callbacks, tags, metadata, **kwargs)
818 if not isinstance(prompt, str):
819 raise ValueError(
820 "Argument `prompt` is expected to be a string. Instead found "
821 f"{type(prompt)}. If you want to run the LLM on multiple prompts, use "
822 "`generate` instead."
823 )
824 return (
...
--> 150 raise ValueError(f"Error raised by bedrock service: {e}")
152 if stop is not None:
153 text = enforce_stop_tokens(text, stop)
ValueError: Error raised by bedrock service: 'str' object has no attribute 'invoke_model'
Does anyone know what could cause this issue?
How to call stability.stable-diffusion-xl model using langchain? Does Prompt Template doesn't support stability.stable-diffusion-x model? It is asking for [text_prompts] key.How to provide it in Prompt Template?
def get_llm(): BEDROCK_CLIENT = boto3.client(service_name='bedrock',region_name='us-west-2',endpoint_url='https://bedrock.us-west-2.amazonaws.com') bedrock_llm = Bedrock( model_id="stability.stable-diffusion-xl", client=BEDROCK_CLIENT ) return bedrock_llm
prompt = PromptTemplate( input_variables=["functionality"], template="Generate image for {functionality} " ) chain = LLMChain(llm=get_llm(), prompt=prompt) response = chain.run({'functionality': functionality})
The above code snippet throws below error: ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: required key [text_prompts] not found, please reformat your input and try again.
@andypindus You seem to be passing the Bedrock client as string. Try fixing that by passing the client object directly.
import boto3
from langchain.llms.bedrock import Bedrock
bedrock_client = boto3.client('bedrock')
llm = Bedrock(
model_id="anthropic.claude-v2",
client=bedrock_client
)
llm("Hi there!")