
Allow returning the final prompts that are sent to LLMs

wu375 opened this issue • 1 comment

Problem

Prompts are formatted and merged (e.g. adding ---context---\n) before being sent to LLMs. The final prompts sent to LLMs can vary depending on the type of chain used and can be quite different from the original inputs to the chain. However, users cannot directly inspect these final prompts, which makes debugging more difficult.
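For illustration, here is a minimal sketch (the template text and variable names are made up for this example, not taken from the PR) of how a chain merges a user's inputs into a larger final prompt:

```python
from langchain.prompts import PromptTemplate

# The user only supplies these inputs ...
user_input = {
    "question": "What is LangChain?",
    "context": "LangChain is a framework for building LLM applications.",
}

# ... but the chain merges them into a larger template, e.g. with a
# ---context--- separator, before the text is sent to the model.
template = PromptTemplate(
    input_variables=["context", "question"],
    template="---context---\n{context}\n\nQuestion: {question}\nAnswer:",
)

final_prompt = template.format(**user_input)
# This merged string, not `user_input`, is what the LLM actually receives.
print(final_prompt)
```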

Changes

  • Allow returning the final prompts as part of LLMResult.
  • Final prompts here are collected before being sent to specific API vendors (e.g. OpenAI), hence the addition is to LLMResult rather than ChatResult.
  • Final prompts are returned from the _call() method of LLMChain in /chains/llm.py (see the sketch after this list).
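Purely as a hypothetical usage sketch (the final_prompts output key and the exact call shape are assumptions for illustration, not the actual code in this PR), the change would let a caller inspect the exact string handed to the model:

```python
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

chain = LLMChain(
    llm=ChatOpenAI(),
    prompt=PromptTemplate.from_template("---context---\n{context}\n\n{question}"),
)

# With the proposed change, the chain's output dict could expose the final
# prompt string alongside the generated text.
result = chain({"context": "LangChain docs", "question": "What is a chain?"})
print(result.get("final_prompts"))  # hypothetical key proposed by this PR
```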

Limitations

  • This is not yet implemented for every usage of LLMResult (currently only in /chat_models, not /llms).
  • Because most chains call predict() instead of _call(), final prompts will not be returned from those chains.
  • Async versions are not implemented.
  • If you like the idea, I can address the above limitations.

Who can review?

Tag maintainers/contributors who might be interested: @hwchase17 @agola11

wu375 · Jun 16, 2023

The latest updates on your projects. Learn more about Vercel for Git.

Name       Status               Updated (UTC)
langchain  ❌ Failed (Inspect)   Jun 16, 2023 2:58am

vercel[bot] · Jun 16, 2023

@wu375 is attempting to deploy a commit to the LangChain Team on Vercel.

A member of the Team first needs to authorize it.

vercel[bot] · Jun 20, 2023

Can I get some feedback so that I can decide whether to keep working on this feature or not? Thank you so much! @hwchase17 @agola11

wu375 · Jun 20, 2023

I believe this is handled pretty well by callbacks, or even more easily in LangSmith!
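For reference, here is a minimal sketch of the callbacks approach (the handler name and usage are illustrative, not an official recipe): a custom BaseCallbackHandler can record the final prompt strings at on_llm_start, right before they reach the model.

```python
from typing import Any, Dict, List

from langchain.callbacks.base import BaseCallbackHandler


class PromptCaptureHandler(BaseCallbackHandler):
    """Collects every prompt string passed to an LLM call."""

    def __init__(self) -> None:
        self.prompts: List[str] = []

    def on_llm_start(
        self, serialized: Dict[str, Any], prompts: List[str], **kwargs: Any
    ) -> None:
        # `prompts` holds the fully formatted strings the model will receive.
        # (Chat models may fire on_chat_model_start with message lists instead.)
        self.prompts.extend(prompts)


# Usage (assuming an existing `chain`):
# handler = PromptCaptureHandler()
# chain.run(inputs, callbacks=[handler])
# print(handler.prompts)
```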

Closing because the PR wouldn't line up with the current directory structure of the library (would need to be in /libs/langchain/langchain instead of /langchain). Feel free to reopen against the current head if it's still relevant!

efriis · Nov 7, 2023