Bedrock Support
I am working on this project and want to use Bedrock as my chat service of choice. I tested this library with OpenAI's GPT-4 and it works perfectly.
Here is the code I am using to try to connect to the AWS Bedrock API:
Mix.install([
  {:langchain, "~> 0.3.0-rc.0"},
  {:kino, "~> 0.12.0"}
])

# Set the OpenAI API key
Application.put_env(
  :langchain,
  :openai_key,
  {MY_BEDROCK_API_KEY_GOES_HERE}
)

alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message
alias LangChain.MessageDelta

defmodule Chatbot do
  def start do
    handler = %{
      on_llm_new_delta: fn _model, %MessageDelta{} = data ->
        IO.write(data.content)
      end,
      on_message_processed: fn _chain, %Message{} = data ->
        IO.puts("")
        IO.puts("")
        IO.inspect(data.content, label: "COMPLETED MESSAGE")
      end
    }

    chain =
      %{
        llm:
          ChatOpenAI.new!(%{
            model: "amazon.titan-text-express-v1",
            stream: true,
            callbacks: [handler],
            endpoint:
              "https://bedrock-runtime.us-east-1.amazonaws.com/model/amazon.titan-text-express-v1/invoke"
          }),
        callbacks: [handler]
      }
      |> LLMChain.new!()
      |> LLMChain.add_message(
        Message.new_system!(
          "You are a helpful assistant. Provide concise and accurate responses to user queries."
        )
      )

    chat_loop(chain)
  end

  defp chat_loop(chain) do
    user_input = IO.gets("You: ") |> String.trim()

    if user_input == "exit" do
      IO.puts("Chatbot: Goodbye!")
    else
      {:ok, updated_chain, _response} =
        chain
        |> LLMChain.add_message(Message.new_user!(user_input))
        |> LLMChain.run()

      chat_loop(updated_chain)
    end
  end
end

# Start the chatbot
Chatbot.start()
Can someone help me out? What am I doing wrong, or is Bedrock just not supported yet?
Hi @SashaDon! I haven't tried Bedrock. What error are you getting?
Trying to process an unexpected response. %{"message" => "'{MY_BEDROCK_API_KEY_GOES_HERE}' not a valid key=value pair (missing equal-sign) in Authorization header: 'Bearer {MY_BEDROCK_API_KEY_GOES_HERE}'."}
10:52:05.771 [error] Error during chat call. Reason: "Unexpected response"
** (MatchError) no match of right hand side value: {:error, %LangChain.Chains.LLMChain{llm: %LangChain.ChatModels.ChatOpenAI{endpoint: "https://bedrock-runtime.us-east-1.amazonaws.com/model/amazon.titan-text-express-v1/invoke", model: "amazon.titan-text-express-v1", api_key: nil, temperature: 1.0, frequency_penalty: 0.0, receive_timeout: 60000, seed: nil, n: 1, json_response: false, stream: true, max_tokens: nil, stream_options: nil, callbacks: [%{on_llm_new_delta: #Function<0.19950590/2 in Chatbot.start/0>, on_message_processed: #Function<1.19950590/2 in Chatbot.start/0>}], user: nil}, verbose: false, verbose_deltas: false, tools: [], _tool_map: %{}, messages: [%LangChain.Message{content: "You are a helpful assistant. Provide concise and accurate responses to user queries.", processed_content: nil, index: nil, status: :complete, role: :system, name: nil, tool_calls: [], tool_results: nil}, %LangChain.Message{content: "hw", processed_content: nil, index: nil, status: :complete, role: :user, name: nil, tool_calls: [], tool_results: nil}], custom_context: nil, message_processors: [], max_retry_count: 3, current_failure_count: 0, delta: nil, last_message: %LangChain.Message{content: "hw", processed_content: nil, index: nil, status: :complete, role: :user, name: nil, tool_calls: [], tool_results: nil}, needs_response: true, callbacks: [%{on_llm_new_delta: #Function<0.19950590/2 in Chatbot.start/0>, on_message_processed: #Function<1.19950590/2 in Chatbot.start/0>}]}, "Unexpected response"}
    ai_experiments/alex/test_bedrock_langchain2.exs:59: Chatbot.chat_loop/1
    ai_experiments/alex/test_bedrock_langchain2.exs:70: (file)
My guess is that Bedrock expects the credentials in the Authorization header to be formatted differently. Can you provide a link to the Bedrock docs for how the API key should be submitted?
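That's exactly what the error suggests: Bedrock's native InvokeModel endpoint doesn't take a Bearer token at all. It requires AWS Signature Version 4 request signing with an access key id and secret access key, which is why the server complains that the Authorization header isn't a set of key=value pairs. A rough sketch of manually signing an invoke request with the `:aws_signature` hex package (function name and arguments from memory, so treat this as an assumption and check the package docs):

```elixir
Mix.install([
  {:aws_signature, "~> 0.3"},
  {:req, "~> 0.4"}
])

url =
  "https://bedrock-runtime.us-east-1.amazonaws.com/model/amazon.titan-text-express-v1/invoke"

body = Jason.encode!(%{"inputText" => "Hello"})

# SigV4 signs the method, URL, headers, and body with the AWS credentials.
# The returned headers include Authorization, X-Amz-Date, and a payload hash,
# instead of a single "Bearer <key>" value.
headers =
  :aws_signature.sign_v4(
    System.fetch_env!("AWS_ACCESS_KEY_ID"),
    System.fetch_env!("AWS_SECRET_ACCESS_KEY"),
    "us-east-1",
    "bedrock",
    :calendar.universal_time(),
    "POST",
    url,
    [{"content-type", "application/json"}],
    body,
    []
  )

Req.post!(url, headers: headers, body: body)
```

So setting `:openai_key` can never work against this endpoint; the signing has to happen per-request, inside the chat model's HTTP layer.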
I've opened a PR that adds Bedrock support to ChatAnthropic: https://github.com/brainlid/langchain/pull/154
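For anyone wanting Claude models on Bedrock, the PR wires AWS credentials and a region into ChatAnthropic. Based on my reading of the PR, usage would look roughly like the sketch below; the struct and field names are assumptions until it's merged, so check the PR diff for the real API:

```elixir
alias LangChain.ChatModels.ChatAnthropic
# Hypothetical config struct from the PR -- name may differ in the final merge.
alias LangChain.Utils.BedrockConfig

ChatAnthropic.new!(%{
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  bedrock: %BedrockConfig{
    # Credentials are a zero-arity fun so they can be re-read
    # (e.g. from an instance profile) on each request.
    credentials: fn ->
      [
        access_key_id: System.fetch_env!("AWS_ACCESS_KEY_ID"),
        secret_access_key: System.fetch_env!("AWS_SECRET_ACCESS_KEY")
      ]
    end,
    region: "us-east-1"
  }
})
```

Note this covers Anthropic's Claude models on Bedrock, not the Titan models from the original post.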
@SashaDon it looks like you're trying to use ChatOpenAI with the Titan models on AWS Bedrock. I had a quick look at the docs, and they don't appear to offer an OpenAI-compatible request/response format (and ChatOpenAI doesn't support Bedrock). I imagine a new TitanChatModel would be needed to support that model.
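To illustrate the mismatch: Titan speaks its own JSON schema on the InvokeModel endpoint rather than OpenAI's chat-messages format. From my memory of the AWS Bedrock docs (field names are worth double-checking), a Titan request body is shaped like this:

```elixir
# Titan InvokeModel request body, shape recalled from the AWS Bedrock docs --
# verify the field names against the current documentation before relying on it.
payload = %{
  "inputText" => "Hello, Titan!",
  "textGenerationConfig" => %{
    "maxTokenCount" => 512,
    "temperature" => 0.7,
    "topP" => 0.9,
    "stopSequences" => []
  }
}

# The response is a different shape again, roughly:
# %{"inputTextTokenCount" => _,
#   "results" => [%{"outputText" => _, "completionReason" => "FINISH"}]}
payload
```

So a TitanChatModel would need to flatten LangChain's message list into `inputText` and parse `results` back out, on top of handling SigV4 auth.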
Thanks @stevehodgkiss for looking into this and answering it. Closing for now.