Flask-SocketIO
Emitting from one instance to a user connected to another instance
Hi, is there an example of emitting from an instance that the user is not connected to? For example, suppose I have 2 instances of a Flask app with a Socket.IO server (running on different servers), and the load balancer routes client A to one of them and client B to the other. When client B performs a specific action, the instance it is connected to should emit a message to client A, but client A is connected to the other instance of the app. Is it enough to just set up a Redis queue and emit as usual? Every instance has this code:
from flask_socketio import SocketIO

socketio = SocketIO()
socketio.init_app(app, message_queue=config.REDIS_URL, async_mode='eventlet')

def action():
    # room is the sid of a client connected to another instance of the app
    socketio.emit(event, data, room=room)
Does Flask-SocketIO handle all the Redis plumbing automatically? The way I was thinking of doing it is to publish messages to Redis and write a listener that checks whether each message is addressed to a client connected to the current instance, and then emits.
You don't have to do anything besides providing the Redis connection URL; Flask-SocketIO handles all the pub/sub traffic necessary to implement emits and broadcasts.
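To make that concrete, here is a minimal sketch (the Redis hostname, event name, and room value are placeholders, not from this thread): configure every instance behind the balancer with the same message_queue URL and emit as usual, and Flask-SocketIO forwards the emit through Redis to whichever instance holds the target client.

# Minimal sketch -- the Redis URL, event name and room value are placeholders.
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
# Same message_queue URL on every instance behind the load balancer.
socketio = SocketIO(app, message_queue='redis://redis-host:6379/0',
                    async_mode='eventlet')

client_sid = 'abc123'  # placeholder: sid of the client to notify

# An emit made on any instance reaches the client in this room even if that
# client is connected to a different instance.
socketio.emit('my_event', {'data': 'value'}, room=client_sid)

A process that only emits and serves no clients of its own can also do this by creating SocketIO(message_queue='redis://redis-host:6379/0') without an app and emitting on it.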
Thanks, will try :)
Hi Miguel, thanks for the amazing package and clear documentation. I have 3 nodes with Flask-SocketIO instances, HAProxy as the load balancer, and Redis as the message queue. Emitting to a room reaches all clients on all nodes only if the emit is done from the node whose Redis instance is the master.
app = Flask(__name__)
socketio = SocketIO(app, cors_allowed_origins='*',
                    message_queue="redis://localhost:6379/",
                    logger=self.log, engineio_logger=self.log,
                    async_mode='eventlet')
def event_emit(self):
    command_json = request.json
    data = self.create_data_obj(command_json['data'])
    event = command_json['event']
    room = command_json.get('room', '')
    self.log.info("emitting {} event (update)".format(event))
    self.socketio.emit(event, data, room=room)
    return jsonify({"status": "ok"}), 200
Have I been missing something? Do I need a special configuration for the Redis cluster?
Are the other nodes connecting to redis? I don't see why it would matter which node is the master, aren't all connections the same?
Thanks for the quick response. I have a Redis cluster with 2 slaves and one master, one on each node. This Redis cluster is used as a DB as well, and it is connected and working. An emit from the node with the Redis master reaches clients on all nodes; an emit from a node with a slave reaches just that slave's clients.
After googling it, I found that publishing an event to a slave only reaches the clients subscribed to that slave, while publishing to the master reaches both its own subscribers and its slaves' subscribers. So my question is: should this kind of Redis configuration work with Flask-SocketIO?
@Yael-F This is pub/sub; there is no storage. I have not considered the idea of using a cluster. I guess it is possible, but the current solution assumes all the instances connect to the same pub/sub instance.
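One way to satisfy that assumption without custom code would be to point the message_queue of every node at the same Redis server (the master) rather than at each node's local replica. The hostname below is a placeholder for this sketch.

# Sketch: all nodes publish/subscribe on the same Redis instance.
socketio = SocketIO(app, cors_allowed_origins='*',
                    message_queue='redis://redis-master.internal:6379/0',
                    async_mode='eventlet')

The drawback is that a failover to a new master requires updating this URL, which is what the Sentinel-based manager shown further down avoids.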
Hi Miguel, thanks for the great support. I solved it by implementing my own client_manager that inherits from RedisManager and uses Redis Sentinel to look up the Redis master for publishing:
import logging
import pickle

import redis
from redis.sentinel import Sentinel
from socketio import RedisManager


class RedisClusterManager(RedisManager):
    log = logging.getLogger()
    master = None

    def _redis_connect(self):
        # Connect as usual, then log whether this node's local Redis is a slave
        super()._redis_connect()
        is_slave = self.redis.info()['role'] == 'slave'
        self.log.info(f"Redis is slave: {is_slave}")

    def _get_master(self):
        # Ask Sentinel for the current master of the "redis_cluster" service
        self.log.info("Getting redis master for socket pub connection")
        self.sentinel = Sentinel([('localhost', 26379)], socket_timeout=0.1)
        self.master = self.sentinel.master_for("redis_cluster", db=0, socket_timeout=1)

    def _publish(self, data):
        # Always publish through the master so the event reaches every node
        self.log.info("Publishing to redis master")
        retry = True
        while True:
            try:
                self._get_master()
                return self.master.publish(self.channel, pickle.dumps(data))
            except redis.exceptions.ConnectionError:
                if retry:
                    self.log.error('Cannot publish to redis master... retrying')
                    retry = False
                else:
                    self.log.error('Cannot publish to redis master... giving up')
                    break
self.app = Flask(__name__)
self.socketio = SocketIO(self.app, cors_allowed_origins='*',
                         client_manager=RedisClusterManager(),
                         logger=self.log, engineio_logger=self.log)
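One note on the design: because _publish() calls _get_master() on every publish, a Sentinel-reported failover is picked up on the next emit without restarting the app, at the cost of an extra Sentinel lookup per published message.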
Have you considered supporting Redis Sentinel for the message queue?
Nobody proposed it before, so not until now. I'll look at adding your code as another option. Thanks!
Closing this, as I'm now tracking this feature more generically: https://github.com/miguelgrinberg/python-socketio/issues/1172