Meteorix

Results: 15 comments of Meteorix

Please provide information that is as complete as possible: screenshots, scripts, logs, etc.

`run_redis_workers_forever` is just a convenience shortcut and does not expose the other parameters. You can copy this function yourself and pass the parameters in.

> My scenario is:
> I want to start multiple workers on one GPU with large memory, with each worker supporting batched inference.
> Is `RedisStreamer` the most suitable class for this?

For this case, the multi-process `Streamer` is enough.

This is probably not supported out of the box: Tornado has its own event loop, which will conflict with the threads used here. You may need to put the requests Tornado receives into a task queue yourself, then use the streamer to do the batching.
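The "task queue + batching" idea above can be sketched with the standard library alone. This is not the actual service-streamer code, just a minimal illustration of the pattern: request handlers push items into a queue, and a background thread drains the queue into batches for the model.

```python
import queue
import threading

class SimpleBatcher:
    """Minimal sketch of queue-based batching (not the service-streamer API)."""

    def __init__(self, predict_batch, max_batch_size=8, max_latency=0.05):
        self._predict = predict_batch      # function: list of inputs -> list of outputs
        self._tasks = queue.Queue()
        self._max_batch = max_batch_size
        self._max_latency = max_latency
        threading.Thread(target=self._loop, daemon=True).start()

    def submit(self, item):
        """Called from a request handler; blocks until this item's result is ready."""
        done = threading.Event()
        holder = {}
        self._tasks.put((item, done, holder))
        done.wait()
        return holder["result"]

    def _loop(self):
        while True:
            # Block for the first item, then collect more until the batch is
            # full or max_latency passes with no new requests.
            batch = [self._tasks.get()]
            try:
                while len(batch) < self._max_batch:
                    batch.append(self._tasks.get(timeout=self._max_latency))
            except queue.Empty:
                pass
            outputs = self._predict([item for item, _, _ in batch])
            for (_, done, holder), out in zip(batch, outputs):
                holder["result"] = out
                done.set()

# Example "model" that batch-doubles its inputs.
batcher = SimpleBatcher(lambda xs: [x * 2 for x in xs])
print(batcher.submit(21))  # → 42
```

In a Tornado app, `submit` would be called from (or dispatched off) the handler rather than blocking the IOLoop directly, which is exactly the conflict the comment warns about.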

@m4gr4th34 thanks for your interest. The main difference between `Streamer` and `ThreadedStreamer` is that `Streamer` uses multiple processes, so it will occupy double the GPU memory when there are 2 workers. Another...
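The memory difference comes down to address spaces, which a stdlib-only illustration (no GPU, not the service-streamer code) can show: thread workers all see the parent's one model object, whereas a process started with `spawn` re-imports the module and builds its own copy, duplicating the (GPU) memory.

```python
import threading

class FakeModel:
    def __init__(self):
        self.loaded = True  # stands in for weights resident on a GPU

model = FakeModel()
seen = []

def thread_worker():
    # Threads run in the parent's address space: same model object.
    seen.append(id(model))

threads = [threading.Thread(target=thread_worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# True: both thread workers share the single parent model.
print(all(i == id(model) for i in seen))

# By contrast, a worker process started with "spawn" would re-import this
# module and construct its own FakeModel — two workers, two full copies,
# which is the doubled GPU memory described above.
```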

yes, take a look: https://github.com/ShannonAI/service-streamer#distributed-gpu-worker

take a look at this example: https://github.com/ShannonAI/service-streamer/blob/master/example/redis_streamer_gunicorn.py

> @MAhaitao999 I wrote one myself

You're welcome to open a PR to this repo; I can help review it.

Try adding `freeze_support()` as suggested. Btw, the default multiprocessing start method in this code is `spawn`, not `fork`.
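A minimal, generic sketch of what that implies (stdlib only, not this repo's code): with the `spawn` start method, worker code must sit behind an `if __name__ == "__main__"` guard because child processes re-import the module, and `multiprocessing.freeze_support()` is additionally needed in frozen Windows executables (it is a no-op elsewhere).

```python
import multiprocessing as mp

def square(x):
    # Must be defined at module top level so spawned children can import it.
    return x * x

def main():
    # Request "spawn" explicitly; it is already the default on Windows/macOS.
    ctx = mp.get_context("spawn")
    with ctx.Pool(processes=2) as pool:
        return pool.map(square, [1, 2, 3])

if __name__ == "__main__":
    mp.freeze_support()  # needed only in frozen Windows executables
    print(main())  # [1, 4, 9]
```

Without the `__main__` guard, each spawned child would re-execute the pool creation on import and recurse into spawning more children.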