
Add batching for traces

Open · niksajakovljevic opened this issue 2 years ago · 1 comment

Batching will help achieve better ingest performance, especially when traces are sent one at a time.

A batch is flushed when either of two conditions is met: the batch reaches its size limit, or a timeout expires.

This PR also adds async support for traces, meaning the client doesn't need to wait for the DB write to complete. This increases ingest performance at the cost of a small risk of data loss.

niksajakovljevic · Aug 10 '22 11:08

The local benchmarks that I've added indicate that batching should provide better performance; however, I still need to run a full-blown benchmark with more realistic data loads.

niksajakovljevic · Aug 10 '22 16:08

It would be good to see the perf difference with and without batching for Jaeger ingestion.

Yes, I will be publishing benchmark numbers in this PR soon.

niksajakovljevic · Aug 16 '22 15:08