azure-event-hubs-spark
Allow multiple partition senders to be open at once
Currently, if another partition sender is needed, the existing one is closed.
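For illustration only, here is a minimal sketch of the change being requested - not the connector's actual code. It assumes the underlying Java client (`com.microsoft.azure.eventhubs`) and its `EventHubClient.createPartitionSenderSync` API, and it caches one open `PartitionSender` per partition id instead of closing the previous sender whenever a different partition is targeted:

```scala
import scala.collection.mutable
import com.microsoft.azure.eventhubs.{EventData, EventHubClient, PartitionSender}

// Sketch: keep one sender per partition open, rather than a single
// sender that gets closed and re-created on every partition switch.
// `client` is assumed to be an already-connected EventHubClient.
class CachedPartitionSenders(client: EventHubClient) {
  private val senders = mutable.Map.empty[String, PartitionSender]

  // Reuse an open sender for this partition, creating it on first use.
  private def senderFor(partitionId: String): PartitionSender =
    senders.getOrElseUpdate(partitionId, client.createPartitionSenderSync(partitionId))

  def send(partitionId: String, event: EventData): Unit =
    senderFor(partitionId).sendSync(event)

  // Close all cached senders when the writer shuts down.
  def close(): Unit = {
    senders.values.foreach(_.closeSync())
    senders.clear()
  }
}
```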
Is this the issue that's preventing Databricks from opening more than one EventHub sink on a single Spark cluster?
No, this doesn't prevent anything in the sink - the current implementation is just slow if your schema has a partition
column in it. If you're sending round-robin or with a partition key, then this isn't relevant (and doesn't affect performance).
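To make the distinction concrete, here's a hedged sketch of the write paths being contrasted, assuming the sink's write schema of a required `body` column plus optional `partition` and `partitionKey` columns (treat the column names, connection string, and hub name as placeholders):

```scala
import org.apache.spark.eventhubs.{ConnectionStringBuilder, EventHubsConf}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder.appName("eh-sink-example").getOrCreate()
import spark.implicits._

// Placeholder connection details - fill in your own namespace and hub.
val connStr = ConnectionStringBuilder("Endpoint=sb://<namespace>.servicebus.windows.net/;...")
  .setEventHubName("<event-hub-name>")
  .build
val ehConf = EventHubsConf(connStr)

val df = Seq(("event-1", "0"), ("event-2", "1")).toDF("body", "partition")

// Slow path today: an explicit `partition` column makes the sink target
// specific partitions, so it churns through partition senders one at a time.
df.write.format("eventhubs").options(ehConf.toMap).save()

// Unaffected paths: round-robin (body only) or a partition key, neither
// of which uses partition senders.
df.select($"body").write.format("eventhubs").options(ehConf.toMap).save()
df.select($"body", lit("my-key").as("partitionKey"))
  .write.format("eventhubs").options(ehConf.toMap).save()
```

Only the first write exercises partition senders, so only that path is affected by this issue.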