
Output score in LLMChain

Open · einarbmag opened this issue 2 years ago · 1 comment

It would be great to get the score output of the LLM (e.g. when using Hugging Face models) for use cases like NLU. It doesn't look possible with the current LLM and chain classes, since they only return the "text" field of the output. I'm resorting to writing my own classes to allow this, but I haven't thought much about how that would integrate with the rest of the framework. Is this something you have considered?

Edit: just to be clear, I'm talking about the logit scores taken directly from the HF model. The map-rerank example, which asks the model to generate scores in the text output, doesn't work very well.

einarbmag avatar Feb 15 '23 10:02 einarbmag

yes! the place to put this would probably be on the generation info dict (allows for arbitrary return types for different types of LLMs) https://github.com/hwchase17/langchain/blob/c60954d0f85fbb0971f54cbcb3497eb5fdd72baf/langchain/schema.py#L32

am happy to help with this if you have a working prototype i can port over

hwchase17 avatar Feb 16 '23 07:02 hwchase17
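For readers landing on this thread, here is a minimal sketch of the idea, assuming the `Generation` class from `langchain.schema` as linked above (the `logit_score` key is just an illustrative name, not an official field):

```python
from langchain.schema import Generation

# generation_info is a free-form dict, so an LLM wrapper could stash
# per-generation metadata such as a logit score there alongside the text.
gen = Generation(
    text="positive",
    generation_info={"logit_score": -0.42},  # "logit_score" is a made-up key
)

print(gen.text)                            # -> "positive"
print(gen.generation_info["logit_score"])  # -> -0.42
```

Chains that only read `.text` would keep working unchanged, while callers that need the score could go through `llm.generate(...)` and read `generation_info` on each returned `Generation`.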

Hi @einarbmag @hwchase17 any news regarding this issue?

I am facing the same issue: the scores from the model are computed at the same time as the text generation. However, the _call method of the LLM class doesn't allow returning anything other than a str, so how could I return both the generated text and the logits during generation?

Matthieu-Tinycoaching avatar Jun 08 '23 15:06 Matthieu-Tinycoaching
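One possible answer, as a rough sketch rather than a tested or official solution: skip the str-only `_call` path and override `_generate`, which returns an `LLMResult` whose `Generation` objects can carry `generation_info`. Exact method signatures vary across LangChain versions (newer releases also pass a `run_manager`), the `gpt2` checkpoint, the `_generate_with_score` helper, and the `mean_logprob` key are all illustrative choices, and the mean token log-probability is just one way to turn the raw logits into a single score.

```python
from typing import Any, List, Optional, Tuple

from langchain.llms.base import LLM
from langchain.schema import Generation, LLMResult
from transformers import AutoModelForCausalLM, AutoTokenizer

# Module-level model objects keep the pydantic-based LLM subclass simple.
_tokenizer = AutoTokenizer.from_pretrained("gpt2")
_model = AutoModelForCausalLM.from_pretrained("gpt2")


def _generate_with_score(prompt: str, max_new_tokens: int = 20) -> Tuple[str, float]:
    """Generate text and return it together with the mean token log-probability."""
    inputs = _tokenizer(prompt, return_tensors="pt")
    out = _model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,
        output_scores=True,
        return_dict_in_generate=True,
    )
    new_tokens = out.sequences[0][inputs["input_ids"].shape[1]:]
    text = _tokenizer.decode(new_tokens, skip_special_tokens=True)
    # Log-probabilities of the tokens that were actually generated.
    transition_scores = _model.compute_transition_scores(
        out.sequences, out.scores, normalize_logits=True
    )
    return text, transition_scores[0].mean().item()


class ScoredHFLLM(LLM):
    """Toy wrapper that surfaces a per-generation score via generation_info."""

    @property
    def _llm_type(self) -> str:
        return "scored-hf"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        # Still required by the LLM base class; the text-only path is unchanged.
        # (Stop sequences are ignored here for brevity.)
        text, _ = _generate_with_score(prompt)
        return text

    def _generate(
        self,
        prompts: List[str],
        stop: Optional[List[str]] = None,
        run_manager: Optional[Any] = None,  # CallbackManagerForLLMRun in recent versions
        **kwargs: Any,
    ) -> LLMResult:
        generations: List[List[Generation]] = []
        for prompt in prompts:
            text, score = _generate_with_score(prompt)
            generations.append(
                [Generation(text=text, generation_info={"mean_logprob": score})]
            )
        return LLMResult(generations=generations)
```

With something like this, `ScoredHFLLM().generate(["Classify: I loved it ->"])` would return an `LLMResult` whose `generations[0][0].generation_info` holds the score, while `LLMChain` would keep consuming the `.text` field as before.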

Hi, @einarbmag! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, the issue you raised was about requesting the ability to output the score of the LLM for use cases like NLU. It seems that the issue has been resolved by adding this functionality to the generation info dict. User hwchase17 offered to help with a prototype, and the functionality was successfully implemented.

Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your contribution to the LangChain repository!

dosubot[bot] avatar Sep 18 '23 16:09 dosubot[bot]