
Max token exceeded error not handled for Anthropic models

Open safi7 opened this issue 1 year ago • 2 comments

Checked other resources

  • [X] I added a very descriptive title to this issue.
  • [X] I searched the LangChain.js documentation with the integrated search.
  • [X] I used the GitHub search to find a similar question and didn't find it.
  • [X] I am sure that this is a bug in LangChain.js rather than my code.
  • [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Error response for large content:

The system encountered an unexpected error during processing. Try your request again.
    at BedrockChat._streamResponseChunks (D:\development\xxx\dddd-api\node_modules\@langchain\community\dist\chat_models\bedrock\web.cjs:530:31)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)

Error Message and Stack Trace (if applicable)

The system encountered an unexpected error during processing. Try your request again.
    at BedrockChat._streamResponseChunks (D:\development\xxx\dddd-api\node_modules\@langchain\community\dist\chat_models\bedrock\web.cjs:530:31)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)

Description

Hello,

I have been using LangChain.js for a while. I use the BedrockChat class to interact with Bedrock models such as Llama, Mistral, Sonnet 3.5, and Opus. Recently I implemented error handling for the max-token-exceeded case; however, it now only works for Llama and Mistral, while Anthropic models such as Sonnet and Opus return a generic error. I would like to know why this happens and how to fix it. I have already updated the npm package. I suspect there may have been changes on the Bedrock or Anthropic side.
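For illustration, the kind of handling described above might look roughly like the sketch below. This is only a sketch: the model ID, region, and the "max tokens" substring check are placeholders and assumptions, not taken from the original report.

import { BedrockChat } from "@langchain/community/chat_models/bedrock";

// Placeholder model ID and region.
const model = new BedrockChat({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: "us-east-1",
});

async function run(prompt: string) {
  try {
    const stream = await model.stream(prompt);
    for await (const chunk of stream) {
      process.stdout.write(String(chunk.content));
    }
  } catch (err: any) {
    // Hypothetical check: per the report above, whatever check sits here
    // works for Llama/Mistral, but Anthropic models surface only the
    // generic "unexpected error" message, so it never matches for them.
    if (typeof err?.message === "string" && err.message.toLowerCase().includes("max tokens")) {
      console.error("Max token limit exceeded");
    } else {
      throw err;
    }
  }
}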

System Info

Using the latest @langchain/community in an up-to-date Node.js project.

safi7 avatar Jul 11 '24 06:07 safi7

Hey @safi7! I'm here to help you with any issues you encounter with LangChain.js. I can assist you with bug fixes, answer your questions, or guide you in becoming a contributor. Let me know how I can assist you!

To handle the "max token exceeded" error for Anthropic models like Sonnet and Opus in LangChain.js, you need to implement error handling specific to these models. Here's how you can modify the completionWithRetry method to handle this error:

// Patched completionWithRetry (a method of the Anthropic chat model class).
protected async completionWithRetry(
  request: AnthropicMessageCreateParams & Kwargs,
  options: AnthropicRequestOptions
): Promise<Anthropic.Message> {
  if (!this.apiKey) {
    throw new Error("Missing Anthropic API key.");
  }
  if (!this.batchClient) {
    // This local `options` (base URL override) shadows the request options
    // parameter only inside this block.
    const options = this.apiUrl ? { baseURL: this.apiUrl } : undefined;
    this.batchClient = new Anthropic({
      ...this.clientOptions,
      ...options,
      apiKey: this.apiKey,
      maxRetries: 0,
    });
  }
  const makeCompletionRequest = async () => {
    try {
      return await this.batchClient.messages.create(
        {
          ...request,
          ...this.invocationKwargs,
        } as AnthropicMessageCreateParams,
        options
      );
    } catch (error: any) {
      // The exact error shape depends on how the Anthropic SDK surfaces API
      // errors; adjust the property access to match your SDK version.
      const errorMessage =
        error?.response?.data?.error?.message ?? error?.message;
      if (typeof errorMessage === "string" && errorMessage.includes("max tokens")) {
        throw new Error("Max token limit exceeded for Anthropic model.");
      }
      throw error;
    }
  };
  return this.caller.callWithOptions(
    { signal: options.signal ?? undefined },
    makeCompletionRequest
  );
}

In this code, the makeCompletionRequest function is wrapped in a try-catch block. If the error response from the Anthropic API contains a message indicating that the maximum token limit has been exceeded, a specific error is thrown; otherwise, the original error is re-thrown. This way, you can handle the "max token exceeded" error specifically for Anthropic models [1][2].
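A caller could then branch on that specific message, for example as in the sketch below. This assumes the patched method above; the ChatAnthropic model name is a placeholder.

import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" });

async function run(prompt: string) {
  try {
    return await model.invoke(prompt);
  } catch (err: any) {
    if (err?.message === "Max token limit exceeded for Anthropic model.") {
      // e.g. truncate the prompt, or fall back to a larger-context model and retry
      console.error("Prompt too large for the model's context window.");
      return undefined;
    }
    throw err;
  }
}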

To continue talking to Dosu, mention @dosu.

dosubot[bot] avatar Jul 11 '24 06:07 dosubot[bot]

Thanks for flagging - will try to look as soon as I can.

jacoblee93 avatar Jul 12 '24 22:07 jacoblee93