
Issue: Update OpenAI model token mapping to reflect new API update 2023-06-13

Open eoriont opened this issue 1 year ago • 2 comments

Issue you'd like to raise.

The blog post here https://openai.com/blog/function-calling-and-other-api-updates

specifies

  • new 16k context version of gpt-3.5-turbo (vs the standard 4k version)

The langchain/llms/openai.py model_token_mapping should be changed to reflect this.

Suggestion:

Add a `gpt-3.5-turbo-16k` entry to `model_token_mapping` with value 16384 (16k tokens).
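A minimal sketch of what the suggested change might look like, assuming the shape of the `model_token_mapping` dict in `langchain/llms/openai.py` (the existing entries and exact values shown here are illustrative):

```python
# Sketch of model_token_mapping in langchain/llms/openai.py,
# extended with the 16k-context models announced in the OpenAI blog post.
model_token_mapping = {
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
    "gpt-3.5-turbo": 4096,
    # New entries suggested by this issue; 16k context = 16384 tokens
    "gpt-3.5-turbo-16k": 16384,
    "gpt-3.5-turbo-16k-0613": 16384,
}
```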

eoriont avatar Jun 13 '23 21:06 eoriont

Yes, this is a very relevant change for everyone trying to update their code in a plug-and-play fashion.

rafaelmbsouza avatar Jun 14 '23 00:06 rafaelmbsouza

My attempt to add the 16k models to `model_token_mapping` did not affect the `token_max` value when calling the map_reduce chain.

llmadd avatar Jun 15 '23 06:06 llmadd

The combine_docs function on the MapReduceDocumentsChain has the following header:

    def combine_docs(
        self,
        docs: List[Document],
        token_max: int = 3000,
        callbacks: Callbacks = None,
        **kwargs: Any,
    ) -> Tuple[str, dict]:

The `token_max` parameter has a default value, which it looks like you're supposed to override depending on your model. Let me know if this helps!
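To illustrate the point, here is a minimal stub (not the real chain) that mirrors the `combine_docs` header above, showing that the 3000-token default is only used when the caller doesn't pass `token_max` explicitly:

```python
from typing import Any, List, Tuple

def combine_docs(
    docs: List[str],
    token_max: int = 3000,
    **kwargs: Any,
) -> Tuple[str, dict]:
    # In the real MapReduceDocumentsChain, intermediate summaries are
    # collapsed until the combined prompt fits under token_max.
    return f"combined {len(docs)} docs under {token_max} tokens", {}

# Default budget: 3000 tokens, regardless of the model's context size
default_result, _ = combine_docs(["doc1", "doc2"])

# Explicit override for a 16k-context model (16384 is an assumed budget)
big_result, _ = combine_docs(["doc1", "doc2"], token_max=16384)
```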

eoriont avatar Jun 17 '23 03:06 eoriont

You were right. I'm now working around the problem temporarily by changing the value of `token_max`, but `token_max` really needs to vary depending on the model.
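One way to avoid hard-coding the override is to derive `token_max` from the model's context size. This is a hypothetical helper (the `context_sizes` dict and `token_max_for` name are assumptions for illustration; in langchain the real mapping lives in `langchain/llms/openai.py`):

```python
# Assumed context sizes per model, in tokens
context_sizes = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
}

def token_max_for(model: str, completion_budget: int = 1000) -> int:
    # Reserve room for the completion so the combined prompt
    # still fits inside the model's context window.
    return context_sizes[model] - completion_budget

# token_max_for("gpt-3.5-turbo-16k") leaves 15384 tokens for the prompt
```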

llmadd avatar Jun 19 '23 03:06 llmadd