Incompatible types for `.withStructuredOutput` between different LLM providers
Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain.js documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain.js rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
I generate the LangChain LLM object conditionally, like this:
import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";

export function getLangChainLlm(m?: Models, p?: ProvidersType) {
  const { model, provider } = getLlmModelAndProvider(m, p);
  switch (provider) {
    case Providers.Enum.groq:
      return new ChatGroq({
        model,
        temperature: 0,
        maxRetries: 2,
      });
    case Providers.Enum.openai:
      return new ChatOpenAI({
        model,
        temperature: 0,
        maxRetries: 2,
      });
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}
Error Message and Stack Trace (if applicable)
const llm = getLangChainLlm()
While trying to use `llm.withStructuredOutput(SomeZodSchema)`, I get this type error:
This expression is not callable.
Each member of the union type '{ <RunOutput extends Record<string, any> = Record<string, any>>(outputSchema: Record<string, any> | StructuredOutputMethodParams<RunOutput, false> | ZodType<...>, config?: ChatOpenAIStructuredOutputMethodOptions<...> | undefined): Runnable<...>; <RunOutput extends Record<string, any> = Record<...>>(outputSchema: Rec...' has signatures, but none of those signatures are compatible with each other.ts(2349)
Description
I think the LangChain docs say it provides a unified interface for withStructuredOutput across different LLMs. In that case, shouldn't the types be compatible?
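For reference, a minimal repro sketch of the error (the schema is a placeholder; package versions are as listed under System Info below):

import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";

const SomeZodSchema = z.object({ answer: z.string() });

// The union type produced by the conditional factory above.
declare const llm: ChatGroq | ChatOpenAI;

// Error ts(2349): the union's `withStructuredOutput` overload sets are not
// mutually compatible, so TS treats the method as not callable.
llm.withStructuredOutput(SomeZodSchema);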
System Info
langchain@0.3.0 | MIT | deps: 12 | versions: 293
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/
keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores
dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.3.0.tgz
.shasum: 006db05e5c774db83725194336c01e01bcace703
.integrity: sha512-gvB8ikvCxL6KWS0HN89gdHMDNi9gUKz3MRMU6/TqC86T0v9Bvg2mIS3pF1qBaoC3/SXG404+uUduOiX7BOXmJQ==
.unpackedSize: 2.9 MB
dependencies:
@langchain/openai: >=0.1.0 <0.4.0
@langchain/textsplitters: >=0.0.0 <0.2.0
js-tiktoken: ^1.0.12
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langsmith: ~0.1.40
openapi-types: ^12.1.3
p-retry: 4
uuid: ^10.0.0
yaml: ^2.2.1
zod: ^3.22.4
zod-to-json-schema: ^3.22.3
maintainers:
- nfcampos
- jacoblee93
- andrewnguonly
- davidduong
- hwchase17
- basproul
dist-tags:
latest: 0.3.0
next: 0.3.0-rc.0
published 20 hours ago by jacoblee93
platform: Mac
Node: 20.10.0
npm: 10.2.3
I found a similar closed issue that might be relevant to your problem: Zod and Enum support. This issue was resolved by updating to version 0.0.59 from 0.0.55 [1].
The type error you are encountering when using llm.withStructuredOutput(SomeZodSchema) with different LLM providers in LangChain.js is likely because the LLM you are using does not support the withStructuredOutput method. The withStructuredOutput method is designed to work with LLMs that support structured output. It checks if the LLM object has a withStructuredOutput method and throws an error if it does not:
if (typeof llm.withStructuredOutput !== "function") {
  throw new Error(
    "The specified LLM does not support the 'withStructuredOutput'. Please ensure you are using an LLM that supports this feature."
  );
}
Additionally, the withStructuredOutput method has specific requirements for the LLM to support structured output, such as binding tools and response formats. For example, in the case of GoogleAI, the method binds tools for function calling and rejects "jsonMode" outright:
if (method === "jsonMode") {
  throw new Error(`Google only supports "functionCalling" as a method.`);
}
Ensure that the LLM provider you are using supports the withStructuredOutput method and meets the necessary requirements for structured output. For instance, GoogleAI only supports the "functionCalling" method and not "jsonMode" [2].
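To illustrate, here is a minimal sketch of passing the method through the config argument (shown with ChatOpenAI since it appears in this thread; the model and tool names are placeholder assumptions):

import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";

const schema = z.object({ answer: z.string() });
const model = new ChatOpenAI({ model: "gpt-4o-mini" }); // placeholder model name

// Request function calling explicitly; per the snippet above, passing
// method: "jsonMode" would throw for providers like GoogleAI.
const extractor = model.withStructuredOutput(schema, {
  name: "extract", // name given to the generated tool/function
  method: "functionCalling",
});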
In LangChain.js, the BaseLanguageModel class includes the withStructuredOutput method, which is designed to return outputs formatted to match a given schema. It is declared as follows:
withStructuredOutput?<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  schema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<boolean>
):
  | Runnable<BaseLanguageModelInput, RunOutput>
  | Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }>;
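A sketch of typical usage, assuming ChatOpenAI with a placeholder schema and model name (run inside an async context):

import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";

const CitySchema = z.object({ city: z.string(), population: z.number() });

const model = new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 });
const structured = model.withStructuredOutput(CitySchema);

// `result` is inferred as { city: string; population: number }.
const result = await structured.invoke("Name the most populous city in Japan.");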
Make sure the LLM you are using extends from BaseLanguageModel and implements the withStructuredOutput method correctly. For example, the ChatOpenAI class in LangChain.js supports structured output and includes the withStructuredOutput method [3].
The ChatGroq class, however, extends from BaseChatModel, not from BaseLanguageModel or any other class that implements the withStructuredOutput method [4]. This could be the reason for the type incompatibility you are experiencing.
To resolve this issue, ensure that the LLM you are using is compatible with the withStructuredOutput method and meets the necessary requirements for structured output.
To continue talking to Dosu, mention @dosu.
@dosu
I am using an LLM which is compatible with withStructuredOutput. The problem is with the TypeScript types only.
If I use just one of the LLMs, either Groq or OpenAI, it works fine. The type problem only appears when the llm type is the union of the Groq and OpenAI types.
To resolve the TypeScript type incompatibility when using withStructuredOutput with either ChatGroq or ChatOpenAI in LangChain.js, ensure that the outputSchema and config parameters are correctly typed and passed. The withStructuredOutput method has multiple overloads, and the correct one is selected based on whether includeRaw is false, true, or left generic as boolean.
Here is the relevant code snippet for the withStructuredOutput method:
withStructuredOutput<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  outputSchema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<false>
): Runnable<BaseLanguageModelInput, RunOutput>;

withStructuredOutput<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  outputSchema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<true>
): Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }>;

withStructuredOutput<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  outputSchema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<boolean>
):
  | Runnable<BaseLanguageModelInput, RunOutput>
  | Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }> {
  const schema: z.ZodType<RunOutput> | Record<string, any> = outputSchema;
  const name = config?.name;
  const description = schema.description ?? "A function available to call.";
  const method = config?.method;
  const includeRaw = config?.includeRaw;
  if (method === "jsonMode") {
    throw new Error(`ChatBedrockConverse does not support 'jsonMode'.`);
  }
  // Additional implementation details...
}
Ensure that the outputSchema and config parameters are correctly typed and passed to the withStructuredOutput method. This will help resolve the type incompatibility issue. Additionally, make sure to follow the setup and usage instructions for ChatGroq and ChatOpenAI as provided in the LangChain.js documentation [1][2].
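For example, a sketch of how includeRaw selects between the overloads (placeholder schema and model name):

import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";

const schema = z.object({ answer: z.string() });
const model = new ChatOpenAI({ model: "gpt-4o-mini" });

// includeRaw omitted (false): Runnable<..., { answer: string }>
const parsedOnly = model.withStructuredOutput(schema);

// includeRaw: true: Runnable<..., { raw: BaseMessage; parsed: { answer: string } }>
const withRaw = model.withStructuredOutput(schema, { includeRaw: true });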
If you are conditionally generating the LLM object, you might need to use a type assertion or a type guard to ensure TypeScript understands that the returned LLM object supports the withStructuredOutput method. Here is an example of how you might do this:
export function getLangChainLlm(m?: Models, p?: ProvidersType): ChatGroq | ChatOpenAI {
  const { model, provider } = getLlmModelAndProvider(m, p);
  switch (provider) {
    case Providers.Enum.groq:
      return new ChatGroq({
        model,
        temperature: 0,
        maxRetries: 2,
      });
    case Providers.Enum.openai:
      return new ChatOpenAI({
        model,
        temperature: 0,
        maxRetries: 2,
      });
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}

const llm = getLangChainLlm();
if ('withStructuredOutput' in llm) {
  llm.withStructuredOutput(SomeZodSchema);
}
This approach ensures that TypeScript recognizes that the llm object has the withStructuredOutput method, resolving the type incompatibility issue.
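Another option, sketched below, reuses the factory above and narrows the union with instanceof; the assumption is that narrowing to a concrete class restores a single callable overload set:

import { ChatGroq } from "@langchain/groq";
import { z } from "zod";

const SomeZodSchema = z.object({ answer: z.string() });
const llm = getLangChainLlm(); // ChatGroq | ChatOpenAI

// Each branch sees one concrete class, so its own overloads apply cleanly.
const structured = llm instanceof ChatGroq
  ? llm.withStructuredOutput(SomeZodSchema)
  : llm.withStructuredOutput(SomeZodSchema);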
To continue talking to Dosu, mention @dosu.
@dosu I think this is a bug and needs human intervention
This seems like it could be a bug that needs your attention, @jacoblee93, could you take a look?
To continue talking to Dosu, mention @dosu.
Hey! Sorry you're running into this - can you confirm you've got the latest versions of @langchain/groq and @langchain/openai?
We just pushed 0.3.0 of core yesterday; updated docs are coming soon, but you'll need the latest versions of all packages to use it.
Hi @jacoblee93
Thanks for your quick reply.
Yes, I updated to 0.3.0 just before creating this issue.
Here is what my package.json looks like:
"@langchain/community": "^0.3.0",
"@langchain/core": "^0.3.0",
"@langchain/groq": "^0.1.1",
"@langchain/openai": "^0.3.0",
I checked the type defs of both packages:
https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-openai/src/chat_models.ts#L1716
https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-groq/src/chat_models.ts#L967
The only difference I see is that the OpenAI one has an extra type in the union: StructuredOutputMethodParams<RunOutput, false>.
Got it - will have a look but OpenAI does accept a few unique args so it may be expected
Thanks @jacoblee93
In that case, can we declare the types in a way that lets TS infer them correctly? This seems to be a case where TS cannot infer the type properly (see the sketch below for one idea).
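One possible shape for that, sketched under the assumption that BaseChatModel in @langchain/core ^0.3 declares withStructuredOutput itself, is to widen the factory's return type to the common base class so only the shared overloads are visible at call sites:

import { BaseChatModel } from "@langchain/core/language_models/chat_models";
import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";

// Same factory as above, widened to the shared base class so the
// provider-specific overload differences never reach the call site.
export function getLangChainLlm(m?: Models, p?: ProvidersType): BaseChatModel {
  const { model, provider } = getLlmModelAndProvider(m, p);
  switch (provider) {
    case Providers.Enum.groq:
      return new ChatGroq({ model, temperature: 0, maxRetries: 2 });
    case Providers.Enum.openai:
      return new ChatOpenAI({ model, temperature: 0, maxRetries: 2 });
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}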
Hi, @shirshendubhowmick. I'm Dosu, and I'm helping the LangChain JS team manage their backlog. I'm marking this issue as stale.
Issue Summary:
- You reported a TypeScript type incompatibility with the `.withStructuredOutput` method in LangChain.js.
- The issue occurs when using `ChatGroq` and `ChatOpenAI` LLMs together, causing a type error.
- Suggestions were made to ensure correct typing and use type assertions.
- @jacoblee93 acknowledged the issue and is investigating further.
- You updated to the latest package versions but still face the issue, indicating a need for better TypeScript type inference.
Next Steps:
- Please confirm if this issue is still relevant with the latest version of LangChain JS. If it is, feel free to comment to keep the discussion open.
- If there is no further activity, this issue will be automatically closed in 7 days.
Thank you for your understanding and contribution!
I can confirm that this is fixed in the latest version.
Thank you for closing the issue, @shirshendubhowmick! We appreciate your help in keeping the repository organized.