Why is the retrieval chain giving a response in only a few words?
I am using langchain==0.1.1 and openai==1.8.0, with all of the latest LangChain modules.
🤖
Based on the information you've provided, the issue might be related to the max_tokens_limit parameter of the ConversationalRetrievalChain class. This parameter caps the total token count of the retrieved documents, and it is only enforced when combine_docs_chain is a StuffDocumentsChain. If max_tokens_limit is set to a small value, the chain passes less context to the model, which can result in a response of only a few words.
Here is the relevant code:
def _reduce_tokens_below_limit(self, docs: List[Document]) -> List[Document]:
    num_docs = len(docs)
    if self.max_tokens_limit and isinstance(
        self.combine_docs_chain, StuffDocumentsChain
    ):
        # Count the tokens in each retrieved document.
        tokens = [
            self.combine_docs_chain.llm_chain._get_num_tokens(doc.page_content)
            for doc in docs
        ]
        token_count = sum(tokens[:num_docs])
        # Drop documents from the end until the total fits within the limit.
        while token_count > self.max_tokens_limit:
            num_docs -= 1
            token_count -= tokens[num_docs]
    return docs[:num_docs]
In this method, if the total token count of the documents exceeds max_tokens_limit, documents are dropped from the end of the list until the total no longer exceeds the limit. If the limit is set too low, the model receives less context and may produce shorter answers.
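To make the trimming concrete, here is a minimal, self-contained sketch of the same loop using made-up token counts (plain Python, not LangChain code):
def reduce_below_limit(token_counts, max_tokens_limit):
    # Same idea as above: drop documents from the end of the list
    # until the running total no longer exceeds the limit.
    num_docs = len(token_counts)
    token_count = sum(token_counts)
    while token_count > max_tokens_limit:
        num_docs -= 1
        token_count -= token_counts[num_docs]
    return num_docs

# Hypothetical documents of 800, 700 and 600 tokens with a limit of 1500:
# the last document is dropped, so only the first two are kept.
print(reduce_below_limit([800, 700, 600], 1500))  # -> 2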
To resolve this issue, you could try increasing the max_tokens_limit or checking whether the StuffDocumentsChain is correctly configured. You can find this code in the libs/langchain/langchain/chains/conversational_retrieval/base.py file.
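For example, a rough sketch of passing a larger max_tokens_limit when building the chain could look like this (assuming langchain==0.1.1 with the langchain-openai and faiss-cpu packages installed; the small FAISS index is only a stand-in for your own retriever):
from langchain.chains import ConversationalRetrievalChain
from langchain_community.vectorstores import FAISS
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Stand-in retriever; replace with your own vector store.
vectorstore = FAISS.from_texts(["example document text"], OpenAIEmbeddings())

chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    retriever=vectorstore.as_retriever(),
    max_tokens_limit=3000,  # raise the token budget for retrieved documents
)

result = chain.invoke({"question": "What is in the documents?", "chat_history": []})
print(result["answer"])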
Also, it's worth noting that the default value of max_tokens_limit in the ConversationalRetrievalChain class in LangChain v0.1.1 is None, which means there is no limit by default. If you haven't set this parameter, it might not be the cause of your issue.
If this doesn't solve your problem, could you please provide more details about your configuration and the exact steps you're taking that lead to this issue? This will help me better understand the problem and provide a more accurate solution.