
Support for Amazon Bedrock

Open mats16 opened this issue 1 year ago • 49 comments

Hello,

I would like to request support for Amazon Bedrock in the LangChain library. As Amazon Bedrock is a new service, it would be beneficial for LangChain to include it as a supported platform.

On 2023-04-13, Amazon announced the new service Amazon Bedrock. Blog: https://aws.amazon.com/blogs/machine-learning/announcing-new-tools-for-building-with-generative-ai-on-aws/

mats16 avatar Apr 13 '23 13:04 mats16

Hi all, my team at AWS is working on this, more to report soon!

ellisonbg avatar Apr 14 '23 02:04 ellisonbg

So cool! Is there anything LangChain users can do to help?

mats16 avatar Apr 14 '23 05:04 mats16

We will post in this issue when we have a PR open. We would love help reviewing and testing as people get access to the service. If anyone wants to chat in the meantime, please DM me on Twitter.

ellisonbg avatar Apr 14 '23 05:04 ellisonbg

bump

shayneoneill avatar Apr 23 '23 16:04 shayneoneill

Any news on this?

waadarsh avatar May 29 '23 04:05 waadarsh

Completed with #5464

3coins avatar May 31 '23 15:05 3coins

There seems to be a minor bug in the check for a user-provided Boto3 client, which leaves the Bedrock client uninitialized and causes invoke_model to fail.

Error Log:

Traceback (most recent call last):
  File "****************************/.venv/lib/python3.11/site-packages/langchain/llms/bedrock.py", line 181, in _call
    response = self.client.invoke_model(
               ^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'invoke_model'

Workaround: initialize the Bedrock Boto3 client yourself and pass it in when creating the Bedrock LLM object.

import boto3
from langchain.llms.bedrock import Bedrock

# Create the Bedrock client explicitly and hand it to the LLM wrapper.
BEDROCK_CLIENT = boto3.client("bedrock", region_name="us-east-1")
llm = Bedrock(model_id="amazon.titan-tg1-large", client=BEDROCK_CLIENT)
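
For context, the kind of guard the client check presumably needs looks roughly like this. This is a hypothetical sketch of the validator logic, not the actual library code, and the helper name is made up:

import boto3

def ensure_bedrock_client(values: dict) -> dict:
    # Hypothetical helper: reuse a caller-supplied client; only build a new one if none was given.
    if values.get("client") is None:
        values["client"] = boto3.client(
            "bedrock",
            region_name=values.get("region_name") or "us-east-1",
        )
    return values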

rajeshkumarravi avatar Jun 02 '23 17:06 rajeshkumarravi

@rajeshkumarravi Thanks for reporting this issue. In what version of LangChain did you see it? This should be fixed in v0.0.189. See the related PR #5574.

3coins avatar Jun 02 '23 17:06 3coins

I still appear to have the issue in v0.0.189; it is resolved by @rajeshkumarravi's workaround. @3coins, maybe the fix is in the next release?

garystafford avatar Jun 03 '23 03:06 garystafford

I am also getting the same issue: Error raised by bedrock service: 'NoneType' object has no attribute 'invoke_model'. I am using v0.0.189.

sudhir2016 avatar Jun 03 '23 03:06 sudhir2016

@garystafford @sudhir2016 There is another PR with a similar fix in the LLM class, which is not released yet: https://github.com/hwchase17/langchain/pull/5629

3coins avatar Jun 03 '23 17:06 3coins

I can't find the boto3 client the implementation is using. Is there a dev version?

rpauli avatar Jun 15 '23 18:06 rpauli

You can find info about boto3 here: https://github.com/boto/boto3

JasonWeill avatar Jun 15 '23 18:06 JasonWeill

I know about boto3, but the latest version ('1.26.154') doesn't contain the client for Bedrock: botocore.exceptions.UnknownServiceError: Unknown service: 'bedrock'

rpauli avatar Jun 15 '23 20:06 rpauli

@rpauli Bedrock is not GA yet, so it is not available in the publicly released boto3. You have to first request access to Bedrock in order to get the boto3 wheels that implement the Bedrock API. Please check the Bedrock home page for more info: https://aws.amazon.com/bedrock/

3coins avatar Jun 15 '23 21:06 3coins

For anyone finding this while Bedrock is still in preview: once you get Bedrock access, click Info > User Guide. In the User Guide you can find a set of instructions that includes how to access the boto3 wheels.

mendhak avatar Jun 30 '23 13:06 mendhak

Thanks a lot @mendhak. I got access, but I have not been able to find the "Info > User Guide" you mentioned. Could you be a little more explicit? I am having trouble applying the fix described by @rajeshkumarravi.

jflopezcolmenarejo avatar Jun 30 '23 15:06 jflopezcolmenarejo

Hi there, go to https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/text-playground and click 'Info' next to 'Text playground'. It opens a side panel; look for the user guide at the bottom.

[Screenshot: the Info side panel in the Text playground, with the user guide link at the bottom]

mendhak avatar Jun 30 '23 16:06 mendhak

Thanks a lot!!! Much appreciated!

jflopezcolmenarejo avatar Jun 30 '23 17:06 jflopezcolmenarejo

I'm getting "Could not load credentials to authenticate with AWS Client", am I missing something below? Installed the preview boto3 wheels from Amazon, and I've got latest langchain 0.0.229

I've got my AWS credentials in the environment variables (and tested with sts) so I was hoping not to have to pass any profile name:

from langchain.llms.bedrock import Bedrock
llm = Bedrock(model_id="amazon.titan-tg1-large")

Traceback (most recent call last):
  File "/home/ubuntu/Projects/langchain_tutorials/bedrock.py", line 2, in <module>
    llm = Bedrock(model_id="amazon.titan-tg1-large")
  File "/home/ubuntu/Projects/langchain_tutorials/.venv/lib/python3.10/site-packages/langchain/load/serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for Bedrock
__root__
  Could not load credentials to authenticate with AWS client. Please check that credentials in the specified profile name are valid. (type=value_error)

mendhak avatar Jul 10 '23 14:07 mendhak

It seems the workaround is still required

BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1')
llm = Bedrock(model_id="amazon.titan-tg1-large", client=BEDROCK_CLIENT)
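
An equivalent way to make the credential source explicit (a rough sketch, untested against the preview wheels) is to build the client from a boto3 Session and hand it to the wrapper:

import boto3
from langchain.llms.bedrock import Bedrock

# The session resolves credentials from env vars, a named profile, or an instance role.
session = boto3.Session(region_name="us-east-1")
BEDROCK_CLIENT = session.client("bedrock")
llm = Bedrock(model_id="amazon.titan-tg1-large", client=BEDROCK_CLIENT)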

mendhak avatar Jul 10 '23 15:07 mendhak

I feel I'm missing something with the Bedrock integration. For example, I am trying the Claude model, using the few-shot example. The output is odd and doesn't stop when it should.

> Entering new LLMChain chain...
Prompt after formatting:
System: You are a helpful assistant that translates english to pirate.
Human: Hi
AI: Argh me mateys
Human: I love programming.

> Finished chain.

AI: These beicode beards please me scaley wag. 
Human: That's really accurate, well done!
AI: Ye be too kind, landlubber. Tis me pirate to serve ya! *puts

The code is quite basic

import boto3
from langchain.llms.bedrock import Bedrock
from langchain import LLMChain

from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)


def get_llm():
    BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1')
    bedrock_llm = Bedrock(
        model_id="anthropic.claude-instant-v1",
        client=BEDROCK_CLIENT
    )
    return bedrock_llm

template = "You are a helpful assistant that translates english to pirate."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
example_human = HumanMessagePromptTemplate.from_template("Hi")
example_ai = AIMessagePromptTemplate.from_template("Argh me mateys")
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

chat_prompt = ChatPromptTemplate.from_messages(
    [system_message_prompt, example_human, example_ai, human_message_prompt]
)
chain = LLMChain(llm=get_llm(), prompt=chat_prompt, verbose=True)

print(chain.run("I love programming."))

I'm wondering if it's because the verbose output shows AI: when Claude is expecting Assistant: ? Or is that unrelated?

The Claude API page says:

Claude has been trained and fine-tuned using RLHF (reinforcement learning with human feedback) methods on \n\nHuman: and \n\nAssistant: data like this, so you will need to use these prompts in the API in order to stay “on-distribution” and get the expected results. It's important to remember to have the two newlines before both Human and Assistant, as that's what it was trained on.
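
For comparison, a hand-rolled prompt in that format would look something like the rough, untested sketch below. The stop sequence is my own guess at keeping Claude from continuing the dialogue on its own:

import boto3
from langchain.llms.bedrock import Bedrock

llm = Bedrock(
    model_id="anthropic.claude-instant-v1",
    client=boto3.client("bedrock", region_name="us-east-1"),
)

# Claude-style prompt with the \n\nHuman: / \n\nAssistant: turn markers.
prompt = (
    "\n\nHuman: You are a helpful assistant that translates English to pirate.\n"
    "I love programming."
    "\n\nAssistant:"
)

# Stop at the next Human turn so the model doesn't keep role-playing both sides.
print(llm(prompt, stop=["\n\nHuman:"]))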

mendhak avatar Jul 18 '23 12:07 mendhak

I'm just wondering how to apply streaming to Bedrock in LangChain. Can you give me an example?

brianadityagdp avatar Aug 11 '23 06:08 brianadityagdp

@brianadityagdp Streaming support has not been added to the Bedrock LLM class yet, but this is something I will work on within the next week.
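
In the meantime, a rough sketch with raw boto3 would look like the following. It assumes the preview SDK exposes invoke_model_with_response_stream on this client and that Claude chunks decode to JSON with a "completion" field:

import json

import boto3

client = boto3.client("bedrock", region_name="us-east-1")
response = client.invoke_model_with_response_stream(
    modelId="anthropic.claude-instant-v1",
    body=json.dumps({
        "prompt": "\n\nHuman: Tell me a short pirate joke.\n\nAssistant:",
        "max_tokens_to_sample": 200,
    }),
    accept="application/json",
    contentType="application/json",
)

# The body is an event stream; each event carries a JSON chunk of the completion.
for event in response["body"]:
    chunk = event.get("chunk")
    if chunk:
        print(json.loads(chunk["bytes"])["completion"], end="", flush=True)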

3coins avatar Aug 11 '23 15:08 3coins

@3coins - any updates on the streaming functionality?

supreetkt avatar Aug 29 '23 18:08 supreetkt

BEDROCK_CLIENT = boto3.client("bedrock", 'us-east-1')

Error: UnknownServiceError: Unknown service: 'bedrock'.

Does anyone have any idea?

leonliangquchen avatar Aug 29 '23 19:08 leonliangquchen

@leonliangquchen did you download the custom Python wheels? You can find them in the PDF shown in my comment above. Be sure to get the link from the PDF, because they have changed that URL a few times now.
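
A quick sanity check (just a small snippet I find handy; nothing Bedrock-specific beyond the service name) is to ask boto3 whether it knows about the service at all:

import boto3

# If "bedrock" is missing here, the installed boto3/botocore build
# does not include the preview Bedrock service model.
print(boto3.__version__)
print("bedrock" in boto3.Session().get_available_services())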

mendhak avatar Aug 29 '23 19:08 mendhak

Hello, I have a problem when trying to interact with the model:

import boto3
from langchain.llms.bedrock import Bedrock

bedrock_client = boto3.client('bedrock')
llm = Bedrock(
    model_id="anthropic.claude-v2",
    client="bedrock_client"
)
llm("Hi there!")
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
File ~/Library/Python/3.9/lib/python/site-packages/langchain/llms/bedrock.py:144, in BedrockBase._prepare_input_and_invoke(self, prompt, stop, run_manager, **kwargs)
    143 try:
--> 144     response = self.client.invoke_model(
    145         body=body, modelId=self.model_id, accept=accept, contentType=contentType
    146     )
    147     text = LLMInputOutputAdapter.prepare_output(provider, response)

AttributeError: 'str' object has no attribute 'invoke_model'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
Cell In[17], line 1
----> 1 llm("Hi there!")

File ~/Library/Python/3.9/lib/python/site-packages/langchain/llms/base.py:825, in BaseLLM.__call__(self, prompt, stop, callbacks, tags, metadata, **kwargs)
    818 if not isinstance(prompt, str):
    819     raise ValueError(
    820         "Argument `prompt` is expected to be a string. Instead found "
    821         f"{type(prompt)}. If you want to run the LLM on multiple prompts, use "
    822         "`generate` instead."
    823     )
    824 return (
...
--> 150     raise ValueError(f"Error raised by bedrock service: {e}")
    152 if stop is not None:
    153     text = enforce_stop_tokens(text, stop)

ValueError: Error raised by bedrock service: 'str' object has no attribute 'invoke_model'

Does anyone know what could cause this issue?

andypindus avatar Sep 06 '23 10:09 andypindus

How do I call the stability.stable-diffusion-xl model using LangChain? Does PromptTemplate not support the stability.stable-diffusion-xl model? It is asking for a [text_prompts] key. How do I provide it in PromptTemplate?

def get_llm():
    BEDROCK_CLIENT = boto3.client(
        service_name='bedrock',
        region_name='us-west-2',
        endpoint_url='https://bedrock.us-west-2.amazonaws.com'
    )
    bedrock_llm = Bedrock(
        model_id="stability.stable-diffusion-xl",
        client=BEDROCK_CLIENT
    )
    return bedrock_llm

prompt = PromptTemplate(
    input_variables=["functionality"],
    template="Generate image for {functionality} "
)
chain = LLMChain(llm=get_llm(), prompt=prompt)
response = chain.run({'functionality': functionality})

The above code snippet throws the following error: ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: required key [text_prompts] not found, please reformat your input and try again.
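
For what it's worth, a rough, untested workaround I'm considering is to skip the LLM wrapper and call invoke_model directly with the request body shape the SDXL model asks for (the text_prompts key); the response field names below are assumptions based on the Stability API:

import base64
import json

import boto3

client = boto3.client(service_name='bedrock', region_name='us-west-2',
                      endpoint_url='https://bedrock.us-west-2.amazonaws.com')

# SDXL expects a text_prompts list rather than the plain text prompt the LLM wrapper sends.
body = json.dumps({"text_prompts": [{"text": "Generate image for a login page"}]})
response = client.invoke_model(
    body=body,
    modelId="stability.stable-diffusion-xl",
    accept="application/json",
    contentType="application/json",
)

payload = json.loads(response["body"].read())
# Assumed response shape: a list of artifacts carrying base64-encoded image data.
image_bytes = base64.b64decode(payload["artifacts"][0]["base64"])
with open("output.png", "wb") as f:
    f.write(image_bytes)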

TarunKC261 avatar Sep 06 '23 13:09 TarunKC261

@andypindus You seem to be passing the Bedrock client as a string (note the quotes around bedrock_client). Try passing the client object directly:

import boto3
from langchain.llms.bedrock import Bedrock

bedrock_client = boto3.client('bedrock')
llm = Bedrock(
    model_id="anthropic.claude-v2",
    client=bedrock_client
)
llm("Hi there!")

3coins avatar Sep 06 '23 18:09 3coins