
Rendering non-chat flows

Open kencheah opened this issue 11 months ago • 3 comments

Hi Team, are there any plans to render the output of non-chat flows? Currently it seems the end node has to be some form of LLMChain to which the chat functionality can connect. I currently have to export the JSON workflow into my local environment and execute it there. Having this functionality on the frontend, if possible, would be great!

kencheah avatar Mar 04 '24 09:03 kencheah

Based on your request, it seems you want to execute non-chat workflows directly on the frontend without having to export the JSON workflow into your local environment. Here's a detailed guide on how you can achieve this:

  1. WebSocket Connection: The WebSocket connection established in the connectWS function is used to communicate with the backend in real-time. You can utilize this connection to send and receive messages related to non-chat workflows.

  2. Sending Workflow Execution Requests: You can modify the sendAll function or create a similar function dedicated to non-chat workflows. This function should format the request with the necessary workflow details and inputs, then send it through the WebSocket connection. The backend should be able to differentiate between chat and non-chat workflow requests.

  3. Receiving Workflow Execution Responses: In the handleWsMessage function, add logic to handle responses from the backend related to non-chat workflow executions. This could involve updating the UI with the results of the workflow execution or handling any errors that occurred.

  4. UI Integration: Integrate a UI component or modal that allows users to input the necessary details for executing a non-chat workflow. This could be a form that captures the workflow ID, input parameters, and any other necessary information. Upon submission, use the modified sendAll function to send the execution request to the backend.

  5. Backend Support: Ensure that your backend is capable of handling requests for non-chat workflow executions. This includes parsing the request, executing the specified workflow with the provided inputs, and sending back the results or any errors.

Here's a simplified example of how you might modify the sendAll function to handle non-chat workflow execution requests:

```typescript
// ws and setErrorData are assumed to come from the surrounding component
// scope, as in the existing sendAll implementation.
async function executeWorkflow(data: WorkflowExecutionProps): Promise<void> {
  try {
    if (ws.current) {
      const workflowExecutionRequest = {
        type: 'execute_workflow', // Indicate this is a workflow execution request
        workflowId: data.workflowId,
        inputs: data.inputs,
      };
      ws.current.send(JSON.stringify(workflowExecutionRequest));
    }
  } catch (error) {
    setErrorData({
      title: "There was an error executing the workflow",
      list: [(error as { message: string }).message],
    });
  }
}
```

In this example, WorkflowExecutionProps is a TypeScript interface you would define based on the inputs required to execute a workflow. The executeWorkflow function formats the request and sends it through the WebSocket connection. You would need to adjust the backend to handle this new request type and execute the specified workflow.
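To make that concrete, here is one way `WorkflowExecutionProps` and a matching response handler might look. Everything here beyond the `executeWorkflow` idea above is an assumption for illustration, not langflow's actual API; in particular, the `workflow_result` response shape is hypothetical and would need to match whatever your backend actually sends:

```typescript
// Hypothetical input shape for executeWorkflow; adapt the fields to your flow.
interface WorkflowExecutionProps {
  workflowId: string;
  inputs: Record<string, unknown>;
}

// Builds the JSON payload executeWorkflow would send over the WebSocket.
function buildExecutionRequest(data: WorkflowExecutionProps): string {
  return JSON.stringify({
    type: "execute_workflow",
    workflowId: data.workflowId,
    inputs: data.inputs,
  });
}

// Hypothetical handleWsMessage branch for non-chat results; assumes the
// backend replies with { type: "workflow_result", workflowId, output }.
// Returns null for message types this branch does not handle.
function describeWorkflowResult(raw: string): string | null {
  const msg = JSON.parse(raw);
  if (msg.type !== "workflow_result") return null;
  return `Workflow ${msg.workflowId} returned: ${JSON.stringify(msg.output)}`;
}
```

Keeping the payload construction in a small pure function like `buildExecutionRequest` also makes the request format easy to unit-test without a live WebSocket.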

Remember, this approach requires both frontend and backend modifications. The backend must be capable of receiving workflow execution requests, processing them, and sending back the results.

Regarding the current structure of the WebSocket connection in the connectWS function and how it is used in the langflow codebase, the ChatService class in the src/backend/langflow/services/chat/service.py file manages WebSocket connections and handles messaging between the server and clients. It maintains a dictionary of active WebSocket connections (self.active_connections) indexed by client IDs. Each entry in this dictionary is a WebSocket object representing the connection to a client. The ChatService class also handles connection and disconnection, message sending, connection closure, message processing and history management, and WebSocket event handling.
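The core pattern described there, a registry of active connections indexed by client ID, can be sketched in TypeScript terms (for symmetry with the frontend code above). The names below are illustrative only and do not reflect langflow's actual classes:

```typescript
// Minimal sketch of a connection registry, analogous to the Python
// ChatService's active_connections dictionary keyed by client ID.
class ConnectionRegistry<Conn> {
  private activeConnections = new Map<string, Conn>();

  // Register a newly connected client.
  connect(clientId: string, conn: Conn): void {
    this.activeConnections.set(clientId, conn);
  }

  // Remove a client on disconnect or connection closure.
  disconnect(clientId: string): void {
    this.activeConnections.delete(clientId);
  }

  // Look up the live connection for a client, if any.
  get(clientId: string): Conn | undefined {
    return this.activeConnections.get(clientId);
  }

  count(): number {
    return this.activeConnections.size;
  }
}
```

A service built on this shape can then route both chat and non-chat messages to the right client by ID, which is what makes the single WebSocket connection reusable for workflow execution.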

As for the sendAll function, it does not appear in src/backend/langflow/services/chat/service.py. That file defines the ChatService class, with methods for managing chat connections, sending and processing messages, and handling WebSocket connections, but no sendAll function; sendAll is a frontend-side function.

I hope this helps! If you have any further questions, feel free to ask.


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] avatar Mar 04 '24 09:03 dosubot[bot]

We have also built and tested OpenAI's Whisper in a non-chat flow, directly calling it through an API as you mentioned. Indeed, there is a critical need for a feature that allows testing inputs and outputs in a flow that does not include the LLM chain.

YamonBot avatar Mar 20 '24 06:03 YamonBot

I completely agree. We're actively working on this and, to be honest, we're already in the testing phase. We anticipate it will be available in just a few weeks at most.

anovazzi1 avatar Mar 21 '24 14:03 anovazzi1