langchainjs
Feat/dynamic max tokens
References https://github.com/hwchase17/langchain/blob/master/langchain/llms/openai.py#L321 to implement the missing logic described in the code comment: passing -1 for the max tokens now calculates the value dynamically from the model's context size and the prompt length.
Only done for the basic OpenAI LLM (the new chat one does it for you 😄).
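The idea can be sketched as follows. This is a minimal illustration, not the PR's actual code: the context-size lookup and the whitespace token counter are hypothetical stand-ins (the real implementation counts tokens with a tiktoken encoding for the model).

```typescript
// Hypothetical table of model context window sizes.
const MODEL_CONTEXT_SIZES: Record<string, number> = {
  "text-davinci-003": 4097,
  "gpt-3.5-turbo": 4096,
};

// Placeholder tokenizer; the real logic uses a tiktoken encoding.
function countTokens(prompt: string): number {
  return prompt.split(/\s+/).filter(Boolean).length;
}

function calculateMaxTokens(prompt: string, modelName: string): number {
  const contextSize = MODEL_CONTEXT_SIZES[modelName] ?? 4097;
  return contextSize - countTokens(prompt);
}

// A maxTokens of -1 triggers the dynamic calculation;
// any other value is passed through unchanged.
function resolveMaxTokens(
  maxTokens: number,
  prompt: string,
  modelName: string
): number {
  return maxTokens === -1
    ? calculateMaxTokens(prompt, modelName)
    : maxTokens;
}

// With the toy tokenizer, "hello world" is 2 tokens: 4097 - 2 = 4095.
console.log(resolveMaxTokens(-1, "hello world", "text-davinci-003"));
```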
There may be a problem with strongly typing the model name as a TiktokenModel: it would not allow other model names to be passed in if the API supports new ones but the type hasn't been updated yet. Therefore I cast it.
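To illustrate the typing concern, here is a self-contained sketch. The `TiktokenModel` union and `getEncodingFor` helper below are simplified stand-ins for the real tiktoken types, just to show why the cast is needed:

```typescript
// Simplified stand-in for the TiktokenModel union type.
type TiktokenModel = "gpt-3.5-turbo" | "text-davinci-003";

// Toy mapping; gpt-3.5-turbo really uses cl100k_base and
// text-davinci-003 uses p50k_base.
function getEncodingFor(model: TiktokenModel): string {
  return model.startsWith("gpt-") ? "cl100k_base" : "p50k_base";
}

// The model name often arrives as a plain string (e.g. from config),
// and the API may accept newer models the union doesn't list yet,
// so a cast keeps the call compiling:
const modelName: string = "text-davinci-003";
const encoding = getEncodingFor(modelName as TiktokenModel);
```

The trade-off is that the cast silences the compiler even for genuinely invalid names, so any lookup keyed on the model name should still handle unknown values at runtime.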