
Expanding LLM output: Adding `logprob` and `finish_reason` to `llm.generate()` output

edjzhang opened this issue 2 years ago

Hi langchain team,

I'd like to help add `logprobs` and `finish_reason` to the OpenAI generation output. Would it be best to build onto the existing `generate` method in the `BaseOpenAI` class and the `Generation` schema object? https://github.com/hwchase17/langchain/pull/293#pullrequestreview-1212330436 references adding a new method, which I believe is the `generate` method, and I wanted to confirm. Thanks!
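
For reference, the `Generation` object mentioned above looked roughly like the following in early-2023 LangChain (a paraphrased sketch, not the exact source); its `generation_info` dict is the natural place for provider-specific fields such as `logprobs` and `finish_reason`:

```python
# Paraphrased sketch of the Generation schema as it looked around early 2023
# (not copied from the source tree). generation_info is a free-form dict where
# provider wrappers can stash fields like finish_reason and logprobs.
from typing import Any, Dict, Optional

from pydantic import BaseModel


class Generation(BaseModel):
    """A single text completion returned by an LLM."""

    text: str
    generation_info: Optional[Dict[str, Any]] = None
```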

edjzhang · Feb 09 '23

I think these should already be in there? https://github.com/hwchase17/langchain/blob/b7747017d72eaeabfa65edb7eec413d3fd006ddb/langchain/llms/openai.py#L275

Or did you mean something else?
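
For anyone landing here later, a minimal sketch of reading those fields back out of `llm.generate()`, assuming the OpenAI wrapper populates `generation_info` as the linked line suggests; passing `logprobs` through `model_kwargs` is an assumption about how to enable it and may differ by version:

```python
# Minimal sketch, assuming the OpenAI wrapper fills generation_info with
# finish_reason and logprobs as in the linked line. Passing logprobs through
# model_kwargs is an assumption; check how your version exposes it.
from langchain.llms import OpenAI

llm = OpenAI(model_name="text-davinci-003", model_kwargs={"logprobs": 1})

result = llm.generate(["Tell me a joke."])
generation = result.generations[0][0]  # first prompt, first completion

print(generation.text)
print(generation.generation_info)  # e.g. {"finish_reason": "stop", "logprobs": {...}}
```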

hwchase17 · Feb 11 '23

Ah, my bad, I totally missed that!

I'm planning to use logprobs as part of some conditional chaining, and I'll open a PR if something generalizable comes out of it (after checking more carefully what's already in the codebase next time...).
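
A hypothetical illustration of that kind of conditional chaining: branch on the average token logprob of a completion and fall back when the model looks unsure. The helper name, threshold, and fallback behaviour are made up for the example, not taken from the issue:

```python
# Hypothetical sketch of conditional chaining on logprobs: answer directly when
# the average token logprob clears a threshold, otherwise fall back. The helper
# name, threshold, and routing are illustrative only.
from langchain.llms import OpenAI

llm = OpenAI(model_name="text-davinci-003", model_kwargs={"logprobs": 1})


def confident_answer(question: str, threshold: float = -1.0) -> str:
    result = llm.generate([question])
    gen = result.generations[0][0]
    logprobs = (gen.generation_info or {}).get("logprobs") or {}
    token_logprobs = logprobs.get("token_logprobs") or []
    avg = sum(token_logprobs) / len(token_logprobs) if token_logprobs else float("-inf")

    if avg >= threshold:
        return gen.text
    # Low confidence: hand off to a different chain, re-prompt with context, etc.
    return "Low confidence; routing to a fallback chain instead."


print(confident_answer("What year was the Eiffel Tower completed?"))
```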

edjzhang · Feb 14 '23