
Feat/dynamic max tokens

Open evad1n opened this issue 1 year ago • 0 comments

References https://github.com/hwchase17/langchain/blob/master/langchain/llms/openai.py#L321 to implement the missing logic described in the comment: passing -1 for max tokens now calculates the value dynamically from the remaining context window.

This is only done for the basic OpenAI LLM (the new chat model handles it for you 😄).
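The idea can be sketched as follows. This is a hedged, illustrative example, not the PR's actual code: `CONTEXT_SIZES`, `countTokens`, and `calculateMaxTokens` are hypothetical names, the context-size values are illustrative, and the character-based token counter is a stand-in for real tiktoken encoding.

```typescript
// Illustrative context window sizes for a few completion models.
// (Hypothetical table; the real implementation maps model names to
// their documented context sizes.)
const CONTEXT_SIZES: Record<string, number> = {
  "text-davinci-003": 4097,
  "code-davinci-002": 8000,
};

// Stand-in for a tiktoken-based counter. The real logic would encode
// the prompt with the model's tokenizer; this rough heuristic just
// keeps the sketch self-contained.
function countTokens(prompt: string): number {
  return Math.ceil(prompt.length / 4);
}

// If maxTokens is -1, fill the remaining context window dynamically;
// otherwise pass the caller's value through unchanged.
function calculateMaxTokens(
  prompt: string,
  modelName: string,
  maxTokens: number
): number {
  if (maxTokens !== -1) return maxTokens;
  const contextSize = CONTEXT_SIZES[modelName] ?? 4097;
  return contextSize - countTokens(prompt);
}

const remaining = calculateMaxTokens("Hello world", "text-davinci-003", -1);
console.log(remaining);
```

With a real tokenizer the subtraction is exact, so the completion can use every token the model's context window allows.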

Strongly typing the model name as a TiktokenModel may be a problem: it would prevent other model names from being passed in when the API supports new ones but the type hasn't been updated yet. Therefore I cast it.

evad1n · Mar 01 '23 23:03