requests
Add os.PathLike support for cert files
Summary
Adds support for any path-like object, including pathlib.Path. See https://github.com/psf/requests/issues/5936#issuecomment-931758148 for more details.
Some things have changed since the linked issue was created, so it may be a good time to add support now.
Fixes #5936
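The mechanism this change relies on is the standard `os.fspath()` function, which accepts `str`, `bytes`, or any object implementing `__fspath__` (the `os.PathLike` protocol, which `pathlib.Path` implements) and returns a plain path string. A minimal sketch of the idea, using a hypothetical `normalize_cert` helper (the actual naming and placement in requests may differ):

```python
import os
import pathlib

def normalize_cert(cert):
    """Normalize a cert argument that may be a single path-like object
    or a (cert, key) tuple of path-like objects.

    Hypothetical helper for illustration only; it shows how os.fspath()
    lets pathlib.Path and other os.PathLike objects be accepted wherever
    a path string was previously required.
    """
    if isinstance(cert, tuple):
        return tuple(os.fspath(part) for part in cert)
    return os.fspath(cert)

# pathlib.Path is converted to a plain string
print(normalize_cert(pathlib.Path("client.pem")))  # → client.pem

# (cert, key) tuples are normalized element-wise
print(normalize_cert((pathlib.Path("client.crt"), pathlib.Path("client.key"))))
```

Because `os.fspath()` passes plain strings through unchanged, existing callers that pass `str` paths are unaffected.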
Simply resync your fork. You should be able to see that now - https://github.com/FlowiseAI/Flowise/pull/158
@HenryHengZJ you're a life saver, man! I am still using render.com to run your code. Do you have any documentation for deploying on AWS?
will work on the docs soon!
I realize this is a little late, but does anyone know how to achieve this in langchainjs? I've been using Flowise to get my feet wet, but now I need to replicate the same chatflow with this feature separately.
I found this systemMessagePrompt in the source, but I'm not sure how to implement it:
Flowise/packages/components/nodes/chains/ConversationalRetrievalQAChain/ConversationalRetrievalQAChain.ts
Trying to make it work here:
// Create a system message that frames how the assistant should answer.
const systemMessage = "I want you to act as a document that I am having a conversation with. Your name is Enigma. You will provide me with answers from the given info. If the answer is not included, search for an answer and return it, and make a note that you searched outside of the given info. Never break character.";

// Create a chain that uses the OpenAI LLM and Pinecone vector store.
const chain = ConversationalRetrievalQAChain.fromLLM(
  chat,
  vectorStore.asRetriever(),
  {
    memory: new BufferMemory({
      // humanPrefix: "I want you to act as a document that I am having a conversation with. You will provide me with answers from the given info. If the answer is not included, search for an answer and return it. Never break character.",
      // humanPrefix: "Search the document index for relevant information pertaining to the user's question. Once the initial data is gathered, refer to the internal knowledge base (memory) to complete the response.",
      memoryKey: "chat_history",
      inputKey: "question",
      returnMessages: true,
    }),
    returnSourceDocuments: false,
    verbose: false,
    questionGeneratorChainOptions: {
      template: systemMessage, // pass systemMessage as context for CRQAChain
    },
  }
);

// Call the chain with the latest user input from the messages array; log errors.
chain
  .call({ question: json.messages[json.messages.length - 1].content })
  .catch(console.error);