
Ideas for `PromptTemplate` composition and type safety improvements

davidfant opened this pull request 10 months ago · 3 comments

This is a draft PR showcasing some core improvements to the typing of prompt templates. @functorism and I went down a rabbit hole trying to get prompt template input types inferred automatically.

Background

I was trying to improve type safety when composing multiple prompts. For example, I wanted to do something like:

const a = PromptTemplate.fromTemplate('hello {a}'); // infers RunInput as { a: string }
const b = HandlebarsPromptTemplate.fromTemplate<{ b: number }>('hello {{b}}'); // explicitly sets RunInput to { b: number }

const composable = ComposablePromptTemplate.compose<{ c: string }>(HandlebarsPromptTemplate)`
    ${a}
    ${b}
    hello {{c}}
`;

// format() now expects an input of shape `{ a: string; b: number; c: string }`
const formatted = await composable.format({ a: 'America', b: 'Belgium', c: 'Croatia' });

Together with @functorism, we created a class ComposablePromptTemplate that does this inference, which I'll gladly share later. Here's a simplified version that paints the picture:

const composeTemplateString = <T>() => <Prompts extends PromptTemplate<unknown>[]>(
  strings: TemplateStringsArray,
  ...prompts: Prompts
): StringWithType<T & ObjectTupleIntersection<{}, { [K in keyof Prompts]: Prompts[K] extends PromptTemplate<infer U> ? U : never }>> => {
  // Interleave the literal chunks with the stringified prompts; the final literal
  // chunk has no matching prompt, hence the fallback to an empty string.
  return strings.map((str, i) => str + (prompts[i]?.toString() ?? '')).join('') as any;
}
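
For context, StringWithType and ObjectTupleIntersection aren't shown above. Here is a rough sketch of what they could look like (assumed shapes for illustration, not the exact helpers we wrote):

// A string carrying a phantom type parameter so the composed RunInput survives as a type.
type StringWithType<T> = string & { __runInput?: T };

// Folds a tuple of object types into a single intersection, e.g.
// ObjectTupleIntersection<{}, [{ a: string }, { b: number }]> -> { a: string } & { b: number }
type ObjectTupleIntersection<Acc, Tuple extends readonly unknown[]> =
  Tuple extends readonly [infer Head, ...infer Rest]
    ? ObjectTupleIntersection<Acc & Head, Rest>
    : Acc;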

Problem

The problem I ran into when using ComposablePromptTemplate is that ChatPromptTemplate and SystemPromptTemplate don't propagate inferred type parameters correctly. A similar lack of propagation has been acknowledged elsewhere, for example in https://github.com/langchain-ai/langchainjs/issues/4155
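
To make the propagation issue concrete, here is roughly what goes wrong (a simplified sketch using PromptTemplate and SystemMessagePromptTemplate from @langchain/core/prompts; the exact constructor signatures may differ):

const child = PromptTemplate.fromTemplate('hello {a}'); // RunInput inferred as { a: string }

// Wrapping the typed child prompt in a message template drops the inferred type:
// the wrapper's RunInput falls back to its default instead of { a: string },
// so a ChatPromptTemplate built from it can't type-check its format() inputs.
const system = new SystemMessagePromptTemplate(child);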

Core parts of this PR

  1. ChatPromptTemplate.fromTypedMessages is a new, better-typed version of ChatPromptTemplate.fromMessages that infers the ChatPromptTemplate RunInput from the combined type of the messages' RunInputs (see the sketch after this list). We couldn't update the existing fromMessages function without breaking current usage, so we added this as a new function.
  2. new {System/Human/AI}PromptTemplate(prompt) infers the RunInput type from the child prompt.
  3. We got confused by TS errors related to RunInput defaulting to Symbol in PromptTemplate, where the Symbol default is used to decide whether the type should be inferred from f-string generics or taken from an explicit type argument. We tried to make that more explicit with InputValues_FSTRING. This is a rough suggestion, and it might make sense for e.g. @jacoblee93 and @functorism to chat and share thoughts on the typings.
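
As a rough illustration of what fromTypedMessages is meant to do (not the exact signature in this PR), the idea is to infer the combined RunInput from the tuple of message templates, reusing the ObjectTupleIntersection sketch above. ChatPromptTemplate and BaseMessagePromptTemplate here are the classes from @langchain/core/prompts:

// Sketch only: the combined RunInput is the intersection of each message template's RunInput.
type MessagesRunInput<Messages extends readonly BaseMessagePromptTemplate[]> =
  ObjectTupleIntersection<
    {},
    { [K in keyof Messages]: Messages[K] extends BaseMessagePromptTemplate<infer U> ? U : never }
  >;

// fromTypedMessages would then be typed roughly like this (the & Record<string, any>
// is only there to satisfy ChatPromptTemplate's RunInput constraint in this sketch):
declare function fromTypedMessages<Messages extends readonly BaseMessagePromptTemplate[]>(
  messages: Messages
): ChatPromptTemplate<MessagesRunInput<Messages> & Record<string, any>>;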

Disclaimer

Do note that this is very much a showcase PR. For example, InputValues_FSTRING should probably not be in the file it's currently in.

davidfant · Apr 19 '24 21:04

The latest updates on your projects. Learn more about Vercel for Git.

Name                  Status     Updated (UTC)
langchainjs-api-refs  ✅ Ready    Apr 19, 2024 9:56pm
langchainjs-docs      ❌ Failed   Apr 19, 2024 9:56pm

vercel[bot] · Apr 19 '24 21:04

Sorry for the delay - this one obviously needs more attention, will look tomorrow!

jacoblee93 · Apr 22 '24 21:04

Heading out for a vacation but tagging in @nfcampos for a look!

jacoblee93 · Apr 26 '24 23:04

Going to close this PR due to merge conflicts. Please re-open off of main if you'd like to get this merged in! Thank you.

bracesproul · Aug 28 '24 18:08