Ideas for `PromptTemplate` composition and type safety improvements
This is a draft PR of some core improvements to typing of prompt templates. @functorism and I went down a rabbit hole trying to make prompt template input types inferred.
Background
I was trying to improve type safety when composing multiple prompts. For example, I wanted to do something like:
```ts
const a = PromptTemplate.fromTemplate('hello {a}'); // infers RunInput to { a: string }
const b = HandlebarsPromptTemplate.fromTemplate<{ b: number }>('hello {{b}}'); // explicitly sets RunInput to { b: number }

const composable = ComposablePromptTemplate.compose<{ c: string }>(HandlebarsPromptTemplate)`
${a}
${b}
hello {{c}}
`;

// format's input type is an object of shape `{ a: string; b: number; c: string }`
const formatted = await composable.format({ a: 'America', b: 'Belgium', c: 'Croatia' });
```
Together with @functorism we created a `ComposablePromptTemplate` class that does this inference, which I'll gladly share later. Here's a simplified version that paints the picture:
```ts
const composeTemplateString =
  <T>() =>
  <Prompts extends PromptTemplate<unknown>[]>(
    strings: TemplateStringsArray,
    ...prompts: Prompts
  ): StringWithType<
    T & ObjectTupleIntersection<{}, { [K in keyof Prompts]: Prompts[K] extends PromptTemplate<infer U> ? U : never }>
  > => {
    // `strings` always has one more entry than `prompts`, so guard the final index
    return strings.map((str, i) => str + (prompts[i]?.toString() ?? '')).join('') as any;
  };
```
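For illustration, here's roughly how that helper would be invoked. This is a hypothetical sketch: `StringWithType` and `ObjectTupleIntersection` are phantom-type helpers from our draft, not langchainjs types.

```ts
// Hypothetical usage sketch of the helper above.
const a = PromptTemplate.fromTemplate('hello {a}'); // RunInput inferred as { a: string }
const b = PromptTemplate.fromTemplate('hello {b}'); // RunInput inferred as { b: string }

const combined = composeTemplateString<{ c: string }>()`
${a}
${b}
hello {c}
`;

// At runtime `combined` is just the concatenated string; at the type level it is
// StringWithType<{ c: string } & { a: string } & { b: string }>, which
// ComposablePromptTemplate can use to type its format() input.
```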
Problem
The problem I ran into when using `ComposablePromptTemplate` is that `ChatPromptTemplate` and `SystemPromptTemplate` don't propagate inferred type parameters correctly. A similar lack of propagation has been acknowledged in other parts of the codebase, for example in https://github.com/langchain-ai/langchainjs/issues/4155
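For example, a minimal sketch of the kind of gap we hit (assuming the current `@langchain/core/prompts` exports; the exact fallback type may differ):

```ts
import {
  PromptTemplate,
  SystemMessagePromptTemplate,
  ChatPromptTemplate,
} from '@langchain/core/prompts';

const system = PromptTemplate.fromTemplate('You are a {role} assistant'); // RunInput inferred as { role: string }
const message = new SystemMessagePromptTemplate(system); // the inferred RunInput is not carried over here
const chat = ChatPromptTemplate.fromMessages([message]);

// Ideally this would be a compile-time error (missing `role`), but because the
// type parameter isn't propagated, RunInput falls back to a loose record and it compiles.
await chat.formatMessages({});
```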
Core parts of this PR
- `ChatPromptTemplate.fromTypedMessages` is a new, better typed version of `ChatPromptTemplate.fromMessages` that infers the ChatPromptTemplate RunInput based on the combined type of the messages' RunInputs (see the sketch after this list). We couldn't update the existing `fromMessages` function without breaking current usage, and therefore added this as a new function.
- `new {System/Human/AI}PromptTemplate(prompt)` infers the RunInput type from the child prompt.
- We got very confused about TS errors related to `RunInput` defaulting to `Symbol` in `PromptTemplate`, which is used to decide whether the PromptTemplate's type should be inferred from the fstring generics or taken from the explicit type parameter, and tried to make that more explicit using `InputValues_FSTRING`. This is a vague suggestion and it might make sense for e.g. @jacoblee93 and @functorism to chat and share thoughts on the typings.
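For the first bullet, here is a rough sketch of the inference `fromTypedMessages` is meant to give (illustrative only; the exact signature in the diff may differ slightly):

```ts
const system = PromptTemplate.fromTemplate('You are a {role} assistant'); // { role: string }
const human = PromptTemplate.fromTemplate('Summarize: {text}'); // { text: string }

const chat = ChatPromptTemplate.fromTypedMessages([
  new SystemMessagePromptTemplate(system),
  new HumanMessagePromptTemplate(human),
]);

// RunInput is inferred as the intersection of the messages' RunInputs,
// roughly { role: string } & { text: string }.
await chat.formatMessages({ role: 'helpful', text: 'LangChain typings' });

// Missing keys would now surface as a type error:
// await chat.formatMessages({ role: 'helpful' }); // error: missing `text`
```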
Disclaimer
Do note that this is very much a showcase PR; e.g. `InputValues_FSTRING` should probably not be in the file it's currently in.
Sorry for the delay - this one obviously needs more attention, will look tomorrow!
Heading out for a vacation but tagging in @nfcampos for a look!
Going to close this PR due to merge conflicts. Please re-open off of `main` if you'd like to get this merged in! Thank you.