paper-qa
Docs() generates exception due to langchain AIMessage() not supplying __len__() when I specify a langchain llm
I've just noticed that if I specify a langchain llm, e.g.:
return Docs(llm = 'langchain', client = ChatOpenAI(), index_path = index_path)
and then try to index a document I get this exception:
Traceback (most recent call last):
File "/Users/mike/src/chatbot/./chatbot-multi-wrap", line 11842, in
which seems to be due to langchain encoding messages with its own AIMessage class, which does not supply a __len__() method. Is this my mistake? Or is it something that needs patching? Thanks!
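For illustration, the failure mode can be reproduced with any object that lacks __len__; here is a minimal stand-in for AIMessage (FakeAIMessage is hypothetical, not the real langchain class):

```python
class FakeAIMessage:
    """Minimal stand-in for langchain_core.messages.AIMessage:
    it wraps the text in a .content attribute and defines no __len__."""

    def __init__(self, content: str):
        self.content = content


msg = FakeAIMessage("hello")

try:
    len(msg)  # roughly what count_tokens / len(citation) end up doing
except TypeError as exc:
    print(exc)  # object of type 'FakeAIMessage' has no len()

print(len(msg.content))  # 5 -- the wrapped string itself is fine
```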
Update: I'm using paper-qa 4.4.0 with langchain 0.1.13
I guess the root issue is that LLMModel.count_tokens() expects text to be a string, whereas it is (now, lately?) a langchain_core.messages.AIMessage, which wraps a string (or a list of dicts of strings). So the best approach might be to rewrite LLMModel to accept either a string or an AIMessage and process it accordingly. I tried monkey-patching LLMModel to test this, but I got sucked into a maze of imports and methods that receive AIMessage instances (another is in Docs.aadd(), where len(citation) similarly throws an exception).
Hello @maspotts, we have just released version 5, which completely removes LangChain from our stack and centers on https://github.com/BerriAI/litellm. So you're free to use LangChain on your side as needed.
If your issue persists, please open a new issue using paper-qa>=5