Ziyad Moraished
3 comments
@wen020 There seems to be a `max_tokens_limit` parameter, but it is not reflected in the documentation. This fixed my issue: `ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever(search_kwargs={"k": 1}), max_tokens_limit=4097)`
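For context, here is a minimal sketch of where that call sits in a full setup, assuming a pre-0.1 LangChain import layout and `OPENAI_API_KEY` set in the environment; the sample texts and the FAISS index are placeholders, not from the original question:

```python
# Minimal sketch: ConversationalRetrievalChain with max_tokens_limit.
# Assumes a pre-0.1 LangChain layout; texts/index are placeholders.
from langchain.chains import ConversationalRetrievalChain
from langchain.embeddings import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.vectorstores import FAISS

# Build a tiny throwaway index (stand-in for the thread's `vectorstore`).
texts = ["LangChain is a framework for LLM apps.", "FAISS is a vector store."]
vectorstore = FAISS.from_texts(texts, OpenAIEmbeddings())

# max_tokens_limit trims the retrieved documents so the combined prompt
# stays under the model's context window; k=1 retrieves a single document.
qa = ConversationalRetrievalChain.from_llm(
    OpenAI(temperature=0),
    vectorstore.as_retriever(search_kwargs={"k": 1}),
    max_tokens_limit=4097,
)

result = qa({"question": "What is LangChain?", "chat_history": []})
print(result["answer"])
```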
I'm having the same issue with Arabic letters!
I installed it by running the following: `brew install md5sha1sum`
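If it helps, after installing, the formula should put GNU-style `md5sum` and `sha1sum` commands on your PATH; a quick check (the file name below is just a placeholder):

```sh
# Hash a downloaded file with both digests (replace model.bin with your file).
md5sum model.bin
sha1sum model.bin
```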