Async functions as destinations
Eliot could support nonblocking destinations, so that async applications could log without blocking the event loop. For example:
import json
import trio
import eliot

async def receive(request):
    async with eliot.start_action():
        async with trio.open_nursery() as nursery:
            nursery.start_soon(one_handler, request)
            nursery.start_soon(another_handler, request)

# stderr_stream is some async byte stream wrapping stderr (illustrative).
async def async_json_to_stderr(obj):
    await stderr_stream.send_all(json.dumps(obj).encode())

eliot.add_destinations(async_json_to_stderr)
This would work if actions and messages had async variants of their methods, for example:
destinations = []  # registered async destinations, e.g. async_json_to_stderr

class Action:
    def __init__(self, **data):
        self.data = data

    async def __aenter__(self):
        # Write the action's start message to the destinations.
        await log_to_destinations(self.data)

    async def __aexit__(self, *exc_info):
        # Write the action's end message.
        await log_to_destinations(self.data)

async def log_to_destinations(data):
    for dest in destinations:
        # Each destination is itself awaited.
        await dest(data)
I'm not sure how I feel about this: waiting on logging means logging can become a bottleneck. My preferred design would be to shove logs onto a different thread where they can be written asynchronously, à la eliot.logwriter (which should probably lose its Twisted dependency), roughly as sketched below.
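A minimal sketch of that thread-based design, assuming a hypothetical ThreadedWriter (not part of Eliot's API) that drains a queue on a background thread so the application itself never blocks on I/O:

import json
import sys
import threading
import queue

class ThreadedWriter:
    """Hypothetical non-Twisted equivalent of eliot.logwriter."""

    _STOP = object()  # sentinel telling the worker thread to exit

    def __init__(self, write=sys.stderr.write):
        self._queue = queue.Queue()
        self._write = write
        self._thread = threading.Thread(target=self._worker, daemon=True)
        self._thread.start()

    def __call__(self, message):
        # Called synchronously by Eliot for each message; just enqueue.
        self._queue.put(message)

    def _worker(self):
        # Runs on the background thread, doing the actual (blocking) writes.
        while True:
            message = self._queue.get()
            if message is self._STOP:
                break
            self._write(json.dumps(message) + "\n")

    def stop(self):
        self._queue.put(self._STOP)
        self._thread.join()

With this approach, eliot.add_destinations(ThreadedWriter()) would work unchanged from synchronous code, since the destination callable itself never blocks.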
The await here could serve the purpose of providing backpressure, which would be needed when pending logs accumulate faster than they can be written out, and would prevent other issues like random out-of-memory errors. I think at the very least the Queue implementation should have a sane default for the maximum number of pending messages, but I guess that's out of scope for this issue.
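As a sketch of what that backpressure could look like (using trio, as in the example above; the names and the MAX_PENDING default are assumptions, not existing Eliot API), a bounded memory channel suspends the logging caller once the buffer is full, instead of letting pending messages grow without limit:

import json
import trio

MAX_PENDING = 1000  # assumed "sane default" for pending messages

send_channel, receive_channel = trio.open_memory_channel(MAX_PENDING)

async def queued_destination(message):
    # Awaiting send() suspends the caller while the buffer is full;
    # that suspension is the backpressure.
    await send_channel.send(message)

async def drain_logs(stream):
    # Background task that does the actual writes.
    async for message in receive_channel:
        await stream.send_all((json.dumps(message) + "\n").encode())

drain_logs would be started in a nursery alongside the application, and queued_destination registered as the destination.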
I think leveraging the event loop could be lighter-weight than the locking required for putting to and getting from a Queue, and one could take more liberties with the implementation when there's no separate thread (i.e. other shared resources don't need as much coordination, etc.). I think it could be less of a bottleneck, but I guess only performance testing can reveal that?
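To illustrate that point (purely a sketch; none of these names exist in Eliot): when every task shares one thread, a plain deque needs no lock, and a single background task can flush it when woken.

import json
from collections import deque
import trio

pending = deque()
wakeup = trio.Event()

def buffered_destination(message):
    # Synchronous and lock-free: all tasks run on one thread, so a plain
    # deque append is safe; just wake the flusher.
    pending.append(message)
    wakeup.set()

async def flush_logs(stream):
    global wakeup
    while True:
        await wakeup.wait()
        wakeup = trio.Event()  # trio events are one-shot, so replace it
        while pending:
            message = pending.popleft()
            await stream.send_all((json.dumps(message) + "\n").encode())

Note that this variant puts no bound on pending, so it trades away the backpressure of the previous sketch; whether it is actually faster is exactly the performance-testing question.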