Rabbit Batch Subscriber approach
I am looking to migrate from Celery to FastStream and I am wondering what the best way would be to achieve a Rabbit Batch Subscriber similar to https://celery-batches.readthedocs.io/en/latest/how_it_works.html?
In celery-batches, the Celery worker process queues tasks in memory until either flush_interval or flush_every is reached, and then passes that list of tasks to the worker in the processing pool together.
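For illustration, a rough sketch of how this could be wired up by hand with FastStream's RabbitBroker and its after_startup lifespan hook; the queue name, AMQP URL, flush thresholds, and process_batch are placeholder assumptions, not an existing batch API:

```python
import asyncio

from faststream import FastStream
from faststream.rabbit import RabbitBroker

# Mirror the celery-batches settings (placeholder values).
FLUSH_EVERY = 100        # flush once this many messages are buffered
FLUSH_INTERVAL = 10.0    # seconds between timer-driven flushes

broker = RabbitBroker("amqp://guest:guest@localhost:5672/")
app = FastStream(broker)

_buffer: list[str] = []
_lock = asyncio.Lock()
_background_tasks: list[asyncio.Task] = []


async def process_batch(batch: list[str]) -> None:
    # Placeholder for the real batch processing logic.
    print(f"processing {len(batch)} messages")


async def _flush() -> None:
    # Swap the buffer out and hand the accumulated batch to the processor.
    global _buffer
    if _buffer:
        batch, _buffer = _buffer, []
        await process_batch(batch)


@broker.subscriber("batched-queue")
async def handle(body: str) -> None:
    # Accumulate each incoming message; flush when the size threshold is hit.
    async with _lock:
        _buffer.append(body)
        if len(_buffer) >= FLUSH_EVERY:
            await _flush()


@app.after_startup
async def start_flush_timer() -> None:
    # Periodic flush so a partially filled buffer does not wait forever
    # for FLUSH_EVERY to be reached.
    async def timer() -> None:
        while True:
            await asyncio.sleep(FLUSH_INTERVAL)
            async with _lock:
                await _flush()

    # Keep a reference so the task is not garbage-collected.
    _background_tasks.append(asyncio.create_task(timer()))
```

One caveat with this approach: with the default acknowledgement behaviour each message is acked as soon as the handler returns, so anything still sitting in the in-memory buffer is lost if the process dies before the next flush.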
I think it is a bad abstraction that hides how things work from the user. I don't like the idea of accumulating messages in memory before passing them to the user, so this functionality could be added as an example in the How-To section, but we shouldn't create a special API for it.
I am ready to take this on!