
No streaming with Python

Open Petopp opened this issue 1 year ago • 9 comments

Hello everyone,

I am trying to process the response from the LLM behind a Flowise endpoint in a structured way, e.g. to stream it into Streamlit the way you know it from OpenAI etc.

Unfortunately I am not able to do this. The response itself works, but unfortunately not as a stream.

import requests


API_URL = "http://192.168.0.133:7000/api/v1/prediction/e8c074c0-0956-4cdf-9786-86b0aa47a989"

def query(payload):
    
    response = requests.post(API_URL, json=payload, stream=True)
    
    if response.status_code == 200:
        for line in response.iter_lines():
            if line:
                # Process the streaming data here
                data = line.decode('utf-8')
                print("Stream:", data)
                
    else:
        print("Error:", response.status_code)

# Example query
query({
    "question": "How fast is light?",
    "overrideConfig": {
        "sessionId": "user1"
    }
})

Does anyone have any ideas on how to do this?

Here is the Flowise part:

[screenshot of the Flowise chatflow]

Petopp avatar May 18 '24 21:05 Petopp

Flowise Version 1.7.2

Petopp avatar May 18 '24 23:05 Petopp

I also want to know how to handle the stream with Python.

csningli avatar May 21 '24 06:05 csningli

For now you will have to use Socket.IO for streaming: https://docs.flowiseai.com/using-flowise/streaming. We're working on changing that to SSE.
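
Once the endpoint speaks SSE, consuming it from Python should only need a small line parser. Here is a sketch of one (offline, run on sample lines; the "token"/"end" event names are made up for illustration and are not confirmed Flowise output). To wire it up, you would feed it `response.iter_lines(decode_unicode=True)` from a `requests.post(..., stream=True)` call.

```python
def parse_sse_events(lines):
    """Group raw text/event-stream lines into (event, data) pairs.

    SSE frames are separated by blank lines; each frame has an optional
    'event:' field and one or more 'data:' fields.
    """
    event, data = "message", []
    for line in lines:
        if line == "":  # a blank line terminates the current frame
            if data:
                yield event, "\n".join(data)
            event, data = "message", []
        elif line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
    if data:  # emit a trailing frame that lacked the final blank line
        yield event, "\n".join(data)

# Offline demo with fabricated frames shaped like token events:
sample = [
    "event: token", "data: Hello", "",
    "event: token", "data: world", "",
    "event: end", "data: [DONE]", "",
]
for ev, payload in parse_sse_events(sample):
    print(ev, payload)
```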

HenryHengZJ avatar May 31 '24 22:05 HenryHengZJ

Hey Henry, how is it going with the SSE? @HenryHengZJ

xu-dong-bl avatar Jun 12 '24 12:06 xu-dong-bl

Hi Team,

Could you please help me with this Python code?

The nodejs version is working fine.

import asyncio
import json
import aiohttp
import socketio

SERVER = "http://localhost:54000"

sio = socketio.AsyncClient(logger=True, engineio_logger=True)


async def query(data):
    async with aiohttp.ClientSession() as session:
        async with session.post(
                f"{SERVER}/api/v1/prediction/7af1f9f3-dd43-4ce4-b76e-bf45103010f5",
                headers={"Content-Type": "application/json"},
                data=json.dumps(data),
        ) as response:
            result = await response.json()
            return result


@sio.event
async def connect():
    print('connected to the server')
    question = "Hey, how are you?"
    result = await query({"question": question, "socketIOClientId": sio.sid})
    print(json.dumps(result))


@sio.on('start')
async def start():
    print("start event received")


@sio.on('token')
async def token(token):
    print(f"token event received with token: {token}")


@sio.on('end')
async def end():
    print("end event received")
    await sio.disconnect()


async def main():
    await sio.connect(SERVER)
    await sio.wait()


asyncio.run(main())

It's not streaming anything. None of the @sio.on decorators are getting triggered.

saidharanidhar avatar Jul 03 '24 21:07 saidharanidhar

I have similar code as @saidharanidhar , and for me also none of the @sio.on decorators are triggered.

How can I stream the actual tokens using python and the flowise API?

dentroai avatar Jul 15 '24 09:07 dentroai

Hey guys I figured out how to use streaming with Flowise API and python.

This discord message solved it: https://discord.com/channels/1087698854775881778/1198668613687726160/1198955953580687370

Here's a working (test-) script for anybody else looking for a solution:

import socketio
import requests
import logging
import time

# Set up logging
logging.basicConfig(level=logging.INFO)

sio = socketio.Client(logger=False, engineio_logger=False) # Set to True for debugging

@sio.event
def connect():
    print("Connected to Flowise server")
    query()

@sio.event
def connect_error(data):
    print(f"Connection error: {data}")

@sio.event
def disconnect():
    print("Disconnected from Flowise server")

@sio.on('start')
def on_start(arg):
    print("Streaming started")

@sio.on('token')
def on_token(token):
    print(f"{token}", end='', flush=True)

@sio.on('end')
def on_end():
    print("\nStreaming ended")
    sio.disconnect()

@sio.on('sourceDocuments')
def on_source_documents(docs):
    print("\nSource Documents:", docs)

@sio.on('usedTools')
def on_used_tools(tools):
    print("\nUsed Tools:", tools)

@sio.on('nextAgent')
def on_next_agent(agent):
    print("\nNext Agent:", agent)

@sio.on('agentReasoning')
def on_agent_reasoning(reasoning):
    print("\nAgent Reasoning:", reasoning)

def query():
    url = "https://your-flowise-server.com/api/v1/prediction/e82b0a5f-7ad8-61c2-b94e-9891c1de33bf"
    data = {
        "question": "Who won the European football championship 2024 yesterday? It was Spain vs England",
        "socketIOClientId": sio.get_sid(),
        "stream": True
    }
    try:
        response = requests.post(url, json=data)
        response.raise_for_status()
        print("Query sent. Waiting for response...")
        print(f"Response status: {response.status_code}")
        print(f"Response content: {response.text[:100]}...")  # Print first 100 characters
    except requests.exceptions.RequestException as e:
        print(f"Error sending query: {e}")

if __name__ == '__main__':
    try:
        sio.connect('https://your-flowise-server.com', transports=['websocket', 'polling'])
        
        # Set a timeout of 60 seconds
        timeout = time.time() + 60
        while time.time() < timeout:
            sio.sleep(1)
            if not sio.connected:
                break
        
        if sio.connected:
            print("Timeout reached. Disconnecting...")
            sio.disconnect()
        
    except Exception as e:
        print(f"An error occurred: {e}")
    finally:
        if sio.connected:
            sio.disconnect()

Also, I had to run pip install websocket-client to make sure it uses the websocket transport instead of falling back to polling.
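
If you want to feed these callback-style events into Streamlit (or anything else that expects a plain iterator), you can bridge them with a queue.Queue. A minimal sketch; wiring push/finish into the on_token/on_end handlers above is left as the obvious step:

```python
import queue

_SENTINEL = object()  # marks end-of-stream inside the queue

def make_token_stream():
    """Return (push, finish, stream).

    push(token)  -- call from the Socket.IO 'token' handler
    finish()     -- call from the 'end' handler
    stream()     -- a generator yielding tokens until finish() is called
    """
    q = queue.Queue()

    def push(token):
        q.put(token)

    def finish():
        q.put(_SENTINEL)

    def stream():
        while True:
            item = q.get()  # blocks until the next token arrives
            if item is _SENTINEL:
                break
            yield item

    return push, finish, stream

# Demo: simulate the Socket.IO callbacks pushing tokens.
push, finish, stream = make_token_stream()
for tok in ["Hel", "lo", "!"]:
    push(tok)
finish()
print("".join(stream()))  # -> Hello!
```

Because queue.Queue is thread-safe, the generator can run in Streamlit's script thread while the Socket.IO client delivers tokens from its own background thread.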

dentroai avatar Jul 15 '24 11:07 dentroai

Thank you @dentro-innovation for the example demo!

HenryHengZJ avatar Jul 15 '24 13:07 HenryHengZJ

Hey,

is there any possibility to create a pipeline between Flowise and Open WebUI with streaming? I have a working pipeline, but without streaming, and I can't get it to run with ChatGPT and my limited coding knowledge.

Chase295 avatar Aug 20 '24 15:08 Chase295

We finally released the Python SDK with streaming: https://github.com/FlowiseAI/FlowisePy

from flowise import Flowise, PredictionData

client = Flowise()  # pass base_url / api_key here if not using the defaults

# Create a prediction with streaming enabled
completion = client.create_prediction(
    PredictionData(
        chatflowId="abc",
        question="Tell me a joke!",
        streaming=True  # Enable streaming
    )
)

# Process and print each streamed chunk
print("Streaming response:")
for chunk in completion:
    print(chunk)

Example streamlit repo: https://github.com/HenryHengZJ/flowise-streamlit
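
To assemble the streamed chunks into one answer string, something like the following could work. This is an offline sketch: it assumes each chunk is a JSON string with "event"/"data" fields mirroring the SSE payloads, so check the chunk shape your SDK version actually yields before relying on it.

```python
import json

def collect_answer(chunks):
    """Concatenate the 'data' of all 'token' events into the final answer.

    Assumes each chunk is a JSON object string like
    {"event": "token", "data": "..."} -- verify against your SDK version.
    """
    parts = []
    for chunk in chunks:
        evt = json.loads(chunk)
        if evt.get("event") == "token":
            parts.append(evt.get("data", ""))
    return "".join(parts)

# Offline demo with fabricated chunks:
fake = [
    '{"event": "token", "data": "Why did"}',
    '{"event": "token", "data": " the chicken..."}',
    '{"event": "end", "data": "[DONE]"}',
]
print(collect_answer(fake))  # -> Why did the chicken...
```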

HenryHengZJ avatar Sep 19 '24 15:09 HenryHengZJ

I am connected to OpenRouter and it's not even streaming in Flowise itself, let alone in the service using Flowise. Not sure how to fix it.

dillfrescott avatar Sep 21 '24 20:09 dillfrescott