
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Validation Error

selvabharathG opened this issue on Aug 3, 2024 · 1 comment

Bug Description

I am trying to build an AI application with RAG using the LangChain framework. Below is the source code of my application. There is no compilation error, but I get the following error when running it.

Error message

```
PS C:\Users\Selvabharath> & C:/Users/Selvabharath/AppData/Local/Programs/Python/Python311/python.exe rag_backend._without_frontpy.py
a
Traceback (most recent call last):
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_aws\llms\bedrock.py", line 690, in _prepare_input_and_invoke
    response = self.client.invoke_model(**request_options)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\botocore\client.py", line 565, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\botocore\client.py", line 1017, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Validation Error

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\Users\Selvabharath\Documents\Selva\9_Work\Python_Projects\Demo_Sean\rag_backend._without_frontpy.py", line 45, in <module>
    response_content = rag_response(rag_index(), question="what is Underlying earnings per share")
  File "c:\Users\Selvabharath\Documents\Selva\9_Work\Python_Projects\Demo_Sean\rag_backend._without_frontpy.py", line 42, in rag_response
    rag_query=index.query(question=question,llm=ragllm)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\indexes\vectorstore.py", line 50, in query
    return chain.invoke({chain.input_key: question})[chain.output_key]
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 166, in invoke
    raise e
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\retrieval_qa\base.py", line 146, in _call
    answer = self.combine_documents_chain.run(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\_api\deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 605, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\_api\deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 383, in __call__
    return self.invoke(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 166, in invoke
    raise e
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\combine_documents\base.py", line 138, in _call
    output, extra_return_dict = self.combine_docs(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\combine_documents\stuff.py", line 249, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\llm.py", line 318, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\_api\deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 383, in __call__
    return self.invoke(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 166, in invoke
    raise e
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\llm.py", line 128, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\llm.py", line 140, in generate
    return self.llm.generate_prompt(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\language_models\llms.py", line 703, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\language_models\llms.py", line 882, in generate
    output = self._generate_helper(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\language_models\llms.py", line 740, in _generate_helper
    raise e
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\language_models\llms.py", line 727, in _generate_helper
    self._generate(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_core\language_models\llms.py", line 1431, in _generate
    self._call(prompt, stop=stop, run_manager=run_manager, **kwargs)
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_aws\llms\bedrock.py", line 1052, in _call
    text, tool_calls, llm_output = self._prepare_input_and_invoke(
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_aws\llms\bedrock.py", line 701, in _prepare_input_and_invoke
    raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: Validation Error
PS C:\Users\Selvabharath>
```

Reproduction

```python
import os
from langchain_community.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.vectorstores import FAISS
from langchain.indexes import VectorstoreIndexCreator
#from langchain_community.llms import Bedrock
from langchain_aws import BedrockLLM

print("a")


def rag_index():
    data_load = PyPDFLoader('C:/Users/Selvabharath/Documents/Selva/9_Work/AI_with_RAG/Data Source/sample_input.pdf')  ## location of the PDF file
    data_split = RecursiveCharacterTextSplitter(
        separators=["\n\n", "\n", " ", ""],
        chunk_size=100,
        chunk_overlap=10)  ## defining chunks

    data_embeddings = BedrockEmbeddings(
        credentials_profile_name='default',     ## aws profile name
        model_id='amazon.titan-embed-text-v1')  ## what embeddings to use from Bedrock

    data_index = VectorstoreIndexCreator(
        text_splitter=data_split,    ## data split
        embedding=data_embeddings,   ## embedding
        vectorstore_cls=FAISS        ## vector store
        )  ##--> index is created

    db_index = data_index.from_loaders([data_load])  ## input is passed to create index
    return db_index


def rag_llm():
    llm = BedrockLLM(
        credentials_profile_name='default',           ## aws profile name
        model_id='mistral.mistral-7b-instruct-v0:2',  ## aws bedrock llm model name - using Mistral for the demo
        model_kwargs={
            "max_tokens_to_sample": 300,  ## max tokens as response
            "temperature": 0.1,           ## randomness from LLM
            "top_p": 0.9                  ## randomness from LLM
            }
        )
    return llm


def rag_response(index, question):  ## param1 - index of the vector store, param2 - question from the front end
    ragllm = rag_llm()
    rag_query = index.query(question=question, llm=ragllm)
    return rag_query


response_content = rag_response(rag_index(), question="what is Underlying earnings per share")
print(response_content)
```

Expected behavior

It should run, vectorize the PDF file and the user input, send both to the LLM, and return the result.

Who can help?

@italojohnny @ogabrielluiz @nicoloboschi @zzzming @jordanrfrazier @mfortman11 @NadirJ

Please help me overcome this error; I have to show a demo to my clients.

Operating System

Windows 10

Langflow Version

1.0

Python Version

3.11

Screenshot

No response

Flow File

No response

selvabharathG — Aug 03 '24 10:08

To resolve the ValidationException error when calling the InvokeModel operation in your Langchain-based AI application using AWS Bedrock, ensure that all required parameters are correctly set and valid. Specifically, check the following:

  1. Model ID: Ensure that the model_id you are using is one of the valid options. The valid options include:

    • "ai21.j2-grande-instruct"
    • "ai21.j2-jumbo-instruct"
    • "ai21.j2-mid"
    • "ai21.j2-mid-v1"
    • "ai21.j2-ultra"
    • "ai21.j2-ultra-v1"
    • "anthropic.claude-instant-v1"
    • "anthropic.claude-v1"
    • "anthropic.claude-v2"
    • "cohere.command-text-v14"
    • "amazon.titan-embed-text-v1" (for embeddings)
    • "mistral.mistral-7b-instruct-v0:2" (for LLM)
  2. Credentials Profile Name: Ensure that the credentials_profile_name is correctly set and that the AWS credentials are properly configured.

  3. Region Name: Ensure that the region_name is valid and supported by AWS Bedrock. For example, us-west-2.

  4. Model Kwargs: Ensure that any additional parameters in model_kwargs are correctly formatted and valid.

  5. Endpoint URL: Ensure that the endpoint_url is correct and accessible.
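On point 4 in particular: the keys in `model_kwargs` are provider-specific, and a common trigger for exactly this `ValidationException` is sending Anthropic's `max_tokens_to_sample` parameter to a non-Anthropic model; Mistral models on Bedrock expect `max_tokens` instead. A minimal sketch of the idea (the `bedrock_kwargs` helper is hypothetical, not part of langchain-aws):

```python
def bedrock_kwargs(model_id: str, max_tokens: int,
                   temperature: float, top_p: float) -> dict:
    """Build model_kwargs with the max-token key each Bedrock provider expects."""
    # The provider is the prefix of the model ID, e.g. "mistral" or "anthropic".
    provider = model_id.split(".")[0]
    # Anthropic's legacy text-completion API uses "max_tokens_to_sample";
    # Mistral (and e.g. Cohere) use "max_tokens".
    key = "max_tokens_to_sample" if provider == "anthropic" else "max_tokens"
    return {key: max_tokens, "temperature": temperature, "top_p": top_p}


kwargs = bedrock_kwargs("mistral.mistral-7b-instruct-v0:2", 300, 0.1, 0.9)
```

Passing the resulting dict as `model_kwargs` should avoid sending a key the target model's request schema rejects.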

Here is the relevant code snippet for setting up the AmazonBedrockComponent model:

```python
def build(
    self,
    model_id: str = "anthropic.claude-instant-v1",
    credentials_profile_name: Optional[str] = None,
    region_name: Optional[str] = None,
    model_kwargs: Optional[dict] = None,
    endpoint_url: Optional[str] = None,
    streaming: bool = False,
    cache: Optional[bool] = None,
) -> LanguageModel:
    try:
        output = Bedrock(
            credentials_profile_name=credentials_profile_name,
            model_id=model_id,
            region_name=region_name,
            model_kwargs=model_kwargs,
            endpoint_url=endpoint_url,
            streaming=streaming,
            cache=cache,
        )  # type: ignore
    except Exception as e:
        raise ValueError("Could not connect to AmazonBedrock API.") from e
    return output
```

Ensure that all these parameters are correctly set when you instantiate the AmazonBedrockComponent model. If any of these parameters are incorrect or missing, it could lead to a ValidationException [1][2][3].
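In the reproduction above, the most likely offender is the `model_kwargs` passed to the Mistral model: `max_tokens_to_sample` is an Anthropic Claude parameter. Assuming Mistral on Bedrock accepts `max_tokens`, `temperature`, and `top_p` (per the Bedrock inference-parameter docs), the kwargs for `rag_llm` would instead look like:

```python
# Hedged fix: model_kwargs for mistral.mistral-7b-instruct-v0:2.
# "max_tokens_to_sample" is Anthropic-specific and is a plausible cause
# of the ValidationException when sent to a Mistral model.
mistral_kwargs = {
    "max_tokens": 300,   # maximum response length
    "temperature": 0.1,  # sampling randomness
    "top_p": 0.9,        # nucleus-sampling cutoff
}
```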

To continue talking to Dosu, mention @dosu.

dosubot[bot] — Aug 03 '24 10:08

@selvabharathG

Are you trying to run this in Langflow? I didn't quite get it.

In Langflow, there's a File component that reads PDF files.


carlosrcoelho — Aug 05 '24 19:08

@selvabharathG

Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.

carlosrcoelho — Aug 07 '24 18:08