
Add os.PathLike support for cert files

Open steveberdy opened this issue 2 years ago • 3 comments

Summary

Adds support for passing any path-like object, including pathlib.Path, as a cert file path. See https://github.com/psf/requests/issues/5936#issuecomment-931758148 for more details. Some things have changed since the linked issue was created, so it may be a good time to add support now.
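
For illustration, here is a minimal sketch of the idea (using a hypothetical normalize_cert helper, not the names in the actual patch): any os.PathLike value is coerced with os.fspath so a pathlib.Path behaves the same as a plain string.

import os
from pathlib import Path

def normalize_cert(cert):
    """Coerce a cert path, or a (cert, key) pair of path-like objects, to plain strings."""
    if isinstance(cert, tuple):
        return tuple(os.fspath(p) for p in cert)
    return os.fspath(cert)

# A pathlib.Path now works the same as a string path.
assert normalize_cert(Path("client.pem")) == "client.pem"
assert normalize_cert((Path("client.crt"), Path("client.key"))) == ("client.crt", "client.key")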

Fixes #5936

steveberdy avatar May 11 '23 02:05 steveberdy

Simply resync your fork. You should be able to see that now - https://github.com/FlowiseAI/Flowise/pull/158

HenryHengZJ avatar May 25 '23 12:05 HenryHengZJ

@HenryHengZJ you're a lifesaver, man! I'm still using render.com to run your code. Do you have any documentation for deploying on AWS?

mahaboobkhan29 avatar May 26 '23 06:05 mahaboobkhan29

Will work on the docs soon!

HenryHengZJ avatar May 26 '23 09:05 HenryHengZJ

I realize this is a little late, but does anyone know how to achieve this in langchainjs? I've been using FlowiseAI to get my feet wet, but now I need to replicate the same chatflow with this feature separately.

I found this systemMessagePrompt, but I'm not sure how to implement it:

Flowise/packages/components/nodes/chains/ConversationalRetrievalQAChain/ConversationalRetrievalQAChain.ts

Trying to make it work here:

  // Assumes the langchainjs imports at module top level:
  //   import { ConversationalRetrievalQAChain } from "langchain/chains";
  //   import { BufferMemory } from "langchain/memory";

  // Create a system message
  const systemMessage = "I want you to act as a document that I am having a conversation with. Your name is Enigma. You will provide me with answers from the given info. If the answer is not included, search for an answer and return it, and make a note that you searched outside of the given info. Never break character."

  // Create a chain that uses the OpenAI LLM and Pinecone vector store.
  const chain = ConversationalRetrievalQAChain.fromLLM(
    chat,
    vectorStore.asRetriever(),
    {
      memory: new BufferMemory({
        // humanPrefix: "I want you to act as a document that I am having a conversation with. You will provide me with answers from the given info. If the answer is not included, search for an answer and return it. Never break character.",
        // humanPrefix: "Search the document index for relevant information pertaining to the user's question. Once the initial data is gathered, refer to the internal knowledge base (memory) to complete the response.",
        memoryKey: "chat_history",
        inputKey: "question",
        returnMessages: true
      }),
      returnSourceDocuments: false,
      verbose: false,
      questionGeneratorChainOptions: {
        template: systemMessage // pass systemMessage as context for the CRQA chain
      }
    }
  );

  // call chain with latest user input from messages array, catch errors
  chain
    .call({ question: json.messages[json.messages.length - 1].content })
    .catch(console.error)

ianmcfall avatar Jul 02 '23 17:07 ianmcfall