botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Validation Error
Bug Description
I am trying to build an AI application with RAG enabled using the LangChain framework. Below is the source code of my application. I get the following error when running it; there is no compilation error.
Error message
```
PS C:\Users\Selvabharath>
> & C:/Users/Selvabharath/AppData/Local/Programs/Python/Python311/python.exe rag_backend._without_frontpy.py
a
Traceback (most recent call last):
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain_aws\llms\bedrock.py", line 690, in _prepare_input_and_invoke
    response = self.client.invoke_model(**request_options)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\botocore\client.py", line 565, in _api_call
    return self._make_api_call(operation_name, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Selvabharath\AppData\Local\Programs\Python\Python311\Lib\site-packages\botocore\client.py", line 1017, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Validation Error

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\Users\Selvabharath\Documents\Selva\9_Work\Python_Projects\Demo_Sean\rag_backend._without_frontpy.py", line 45, in
```
Reproduction
```python
import os
from langchain_community.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.vectorstores import FAISS
from langchain.indexes import VectorstoreIndexCreator
#from langchain_community.llms import Bedrock
from langchain_aws import BedrockLLM

print("a")

def rag_index():
    data_load=PyPDFLoader('C:/Users/Selvabharath/Documents/Selva/9_Work/AI_with_RAG/Data Source/sample_input.pdf') ## location of the PDF file
    data_split=RecursiveCharacterTextSplitter(separators=["\n\n", "\n", " ", ""], chunk_size=100, chunk_overlap=10) ## defining chunks
    data_embeddings=BedrockEmbeddings(
        credentials_profile_name='default', ## aws profile name
        model_id='amazon.titan-embed-text-v1') ## which Bedrock embeddings to use
    data_index=VectorstoreIndexCreator(
        text_splitter=data_split, ## data split
        embedding=data_embeddings, ## embedding
        vectorstore_cls=FAISS ## vector store
    ) ##--> index is created
    db_index=data_index.from_loaders([data_load]) ## input is passed to create index
    return db_index

def rag_llm():
    llm=BedrockLLM(
        credentials_profile_name='default', ## aws profile name
        model_id='mistral.mistral-7b-instruct-v0:2', ## aws bedrock llm model name - using Mistral for the demo
        model_kwargs={
            "max_tokens_to_sample": 300, ## max tokens in response
            "temperature": 0.1, ## randomness from LLM
            "top_p": 0.9 ## randomness from LLM
        }
    )
    return llm

def rag_response(index, question): ## param1 - index of the vector, param2 - question from the front end
    ragllm=rag_llm()
    rag_query=index.query(question=question, llm=ragllm)
    return rag_query

response_content = rag_response(rag_index(), question="what is Underlying earnings per share")
print(response_content)
```
Expected behavior
It should run, vectorize the PDF file and the user input, send both to the LLM, and return the result.
Who can help?
@italojohnny @ogabrielluiz @nicoloboschi @zzzming @jordanrfrazier @mfortman11 @NadirJ
Please help me overcome this error; I have to show a demo to my clients.
Operating System
Windows 10
Langflow Version
1.0
Python Version
3.11
Screenshot
No response
Flow File
No response
To resolve the `ValidationException` error when calling the InvokeModel operation in your LangChain-based AI application using AWS Bedrock, ensure that all required parameters are correctly set and valid. Specifically, check the following:

- **Model ID**: Ensure that the `model_id` you are using is one of the valid options. The valid options include:
  - "ai21.j2-grande-instruct"
  - "ai21.j2-jumbo-instruct"
  - "ai21.j2-mid"
  - "ai21.j2-mid-v1"
  - "ai21.j2-ultra"
  - "ai21.j2-ultra-v1"
  - "anthropic.claude-instant-v1"
  - "anthropic.claude-v1"
  - "anthropic.claude-v2"
  - "cohere.command-text-v14"
  - "amazon.titan-embed-text-v1" (for embeddings)
  - "mistral.mistral-7b-instruct-v0:2" (for LLM)
- **Credentials Profile Name**: Ensure that the `credentials_profile_name` is correctly set and that the AWS credentials are properly configured.
- **Region Name**: Ensure that the `region_name` is valid and supported by AWS Bedrock. For example, `us-west-2`.
- **Model Kwargs**: Ensure that any additional parameters in `model_kwargs` are correctly formatted and valid.
- **Endpoint URL**: Ensure that the `endpoint_url` is correct and accessible.
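The `model_kwargs` point deserves special attention here: parameter names in the InvokeModel request body are provider-specific, and the reproduction code passes Anthropic's `max_tokens_to_sample` to a Mistral model, which is a plausible cause of this very ValidationException. The sketch below shows how such kwargs end up in the request body for a Mistral instruct model; the body format shown is an assumption based on Bedrock's documented Mistral schema, so verify it against the docs for your model:

```python
import json

def mistral_body(prompt: str, **kwargs) -> str:
    # Mistral instruct models on Bedrock expect the prompt wrapped in
    # [INST] ... [/INST]; generation parameters sit alongside it in the body.
    return json.dumps({"prompt": f"<s>[INST] {prompt} [/INST]", **kwargs})

# Mistral's name for the token limit is "max_tokens"; sending Anthropic's
# "max_tokens_to_sample" instead is rejected with a ValidationException.
body = mistral_body(
    "what is Underlying earnings per share",
    max_tokens=300,
    temperature=0.1,
    top_p=0.9,
)
print(body)
```

Renaming `max_tokens_to_sample` to `max_tokens` in the `model_kwargs` of `rag_llm()` would be the first thing to try.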
Here is the relevant code snippet for setting up the AmazonBedrockComponent model:
```python
def build(
    self,
    model_id: str = "anthropic.claude-instant-v1",
    credentials_profile_name: Optional[str] = None,
    region_name: Optional[str] = None,
    model_kwargs: Optional[dict] = None,
    endpoint_url: Optional[str] = None,
    streaming: bool = False,
    cache: Optional[bool] = None,
) -> LanguageModel:
    try:
        output = Bedrock(
            credentials_profile_name=credentials_profile_name,
            model_id=model_id,
            region_name=region_name,
            model_kwargs=model_kwargs,
            endpoint_url=endpoint_url,
            streaming=streaming,
            cache=cache,
        )  # type: ignore
    except Exception as e:
        raise ValueError("Could not connect to AmazonBedrock API.") from e
    return output
```
Ensure that all these parameters are correctly set when you instantiate the AmazonBedrockComponent model. If any of these parameters are incorrect or missing, it could lead to a ValidationException [1][2][3].
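To catch this class of mistake before the request ever reaches InvokeModel, you can sanity-check the keys in `model_kwargs` against the parameter names each Bedrock provider family accepts. The helper below is purely illustrative: `unexpected_kwargs` is a hypothetical function, and the allowed-key sets are a hand-picked subset rather than an authoritative list, so treat them as a starting point:

```python
# Illustrative subset of generation-parameter names per provider family;
# consult each model's Bedrock documentation for the full list.
EXPECTED_KEYS = {
    "anthropic": {"max_tokens_to_sample", "temperature", "top_p", "top_k", "stop_sequences"},
    "mistral": {"max_tokens", "temperature", "top_p", "top_k", "stop"},
    "ai21": {"maxTokens", "temperature", "topP", "stopSequences"},
}

def unexpected_kwargs(model_id: str, model_kwargs: dict) -> set:
    # The provider family is the prefix of the model ID, e.g. "mistral"
    # in "mistral.mistral-7b-instruct-v0:2".
    provider = model_id.split(".", 1)[0]
    allowed = EXPECTED_KEYS.get(provider)
    if allowed is None:
        return set()  # unknown provider: nothing to check against
    return set(model_kwargs) - allowed

# The kwargs from the bug report fail for a Mistral model because
# "max_tokens_to_sample" is Anthropic's name for the token limit:
bad = unexpected_kwargs(
    "mistral.mistral-7b-instruct-v0:2",
    {"max_tokens_to_sample": 300, "temperature": 0.1, "top_p": 0.9},
)
print(bad)  # {'max_tokens_to_sample'}
```

Running a check like this at startup turns an opaque server-side `Validation Error` into an actionable message about which parameter name is wrong.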
To continue talking to Dosu, mention @dosu.
@selvabharathG
Are you trying to run this in Langflow? I didn't quite get it.
In Langflow, there's a File component that reads PDF files.
@selvabharathG
Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.