Emmanuel Irog-irog
Hey guys, maybe this could give us a clue. In FlowiseAI you can add a system prompt and other things to a ConversationalQA Chain.
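For reference, a rough LangChain-side equivalent is passing a custom QA prompt into `ConversationalRetrievalChain.from_llm` via `combine_docs_chain_kwargs`. A minimal sketch, where the prompt text is illustrative and `retriever` is assumed to already exist:
```python
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

# Illustrative "system-style" prompt; the stuff QA chain expects {context} and {question}
qa_prompt = PromptTemplate.from_template(
    "You are a helpful assistant. Answer using only the context below.\n\n"
    "Context:\n{context}\n\nQuestion: {question}\nAnswer:"
)

chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    retriever=retriever,  # assumed to be defined elsewhere
    combine_docs_chain_kwargs={"prompt": qa_prompt},  # overrides the default QA prompt
)
```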
Hey, I haven't faced this issue yet haha, but my goal was to reduce the number of documents retrieved by using compression. I tried one of the first examples, which used...
Tried using the embeddings_filter:
```python
from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import EmbeddingsFilter

# Keep only documents whose embedding similarity to the query passes the threshold
embeddings_filter_1 = EmbeddingsFilter(embeddings=embeddings, similarity_threshold=0.76)
compression_retriever_1 = ContextualCompressionRetriever(
    base_compressor=embeddings_filter_1,
    base_retriever=retriever_1,
)
```
It gave me an error of `InvalidRequestError: This model's maximum context length is 16385 tokens.` However,...
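One direction sometimes suggested for that context-length error is to split the retrieved documents into smaller chunks before filtering, so fewer tokens reach the model. A rough sketch, reusing the `embeddings` and `retriever_1` objects from above; the chunk settings are illustrative, not from the original message:
```python
from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import DocumentCompressorPipeline, EmbeddingsFilter
from langchain.text_splitter import CharacterTextSplitter

# Break each retrieved document into smaller pieces first
splitter = CharacterTextSplitter(chunk_size=300, chunk_overlap=0, separator=". ")
# Then drop the pieces that aren't similar enough to the query
relevance_filter = EmbeddingsFilter(embeddings=embeddings, similarity_threshold=0.76)

pipeline_compressor = DocumentCompressorPipeline(transformers=[splitter, relevance_filter])
compression_retriever = ContextualCompressionRetriever(
    base_compressor=pipeline_compressor,
    base_retriever=retriever_1,
)
```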
Encountered this error when upgrading from v229 to v235. It works fine at v229: `chain = create_structured_output_chain(OrderID, llm=ChatOpenAI(temperature=0, model="gpt-3.5-turbo-16k"), prompt=prompt)` OUTPUT: `{'order_id': 123456789}`. When I run this it outputs a...
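For context on what that call looks like end to end, here is a minimal sketch; the `OrderID` fields, the prompt text, and the input string are illustrative assumptions, not taken from the original report:
```python
from langchain.chains.openai_functions import create_structured_output_chain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate, SystemMessagePromptTemplate
from pydantic import BaseModel, Field

class OrderID(BaseModel):
    """An order identifier extracted from the text (fields are assumed for illustration)."""
    order_id: int = Field(..., description="The numeric order ID")

prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template("Extract the order ID from the user's message."),
    HumanMessagePromptTemplate.from_template("{input}"),
])

chain = create_structured_output_chain(
    OrderID,
    llm=ChatOpenAI(temperature=0, model="gpt-3.5-turbo-16k"),
    prompt=prompt,
)

# The reporter's output on v229 was {'order_id': 123456789}; behavior on v235 is what's in question
result = chain.run("My order number is 123456789")
print(result)
```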