influxdb-client-python

Automated split of large number of points into smaller chunks

Open • powersj opened this issue 3 years ago • 2 comments

As in the Python 2.x client library: automatically split a large number of points into smaller chunks, where the minimal chunk size is the user-defined batch size, for the synchronous WriteAPI.
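
For context, this is presumably a reference to the older influxdb-python package, whose write_points() call accepts a batch_size and splits the point list into multiple requests itself. A rough sketch of that behaviour (connection values are placeholders):

from influxdb import InfluxDBClient

client = InfluxDBClient(host='localhost', port=8086, database='example')
points = [{"measurement": "mem", "tags": {"host": "h1"}, "fields": {"used": 21.0}}]
# write_points() transparently splits `points` into requests of at most 10_000 points.
client.write_points(points, batch_size=10_000)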

powersj • Sep 13 '22 18:09

@bednar I've converted this into an issue so we can track our conversation about this.

I am still not convinced that this makes sense to do. The current library already has batching. We even have an example that shows using the rx library to do synchronous writes with batching as well. Compatibility with the previous library is not necessarily something we guarantee. I am inclined to close this as won't fix.
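
For reference, a minimal sketch of the batching the current client already provides; the URL, token, org, and bucket values below are placeholders:

from influxdb_client import InfluxDBClient, WriteOptions

with InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org") as client:
    # WriteOptions defaults to batching mode: records are grouped into batch_size-sized
    # requests and flushed in the background.
    with client.write_api(write_options=WriteOptions(batch_size=5_000, flush_interval=10_000)) as write_api:
        write_api.write(bucket="my-bucket", record=["mem,host=h1 used=21.0", "mem,host=h2 used=32.0"])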

Thoughts?

cc: @popey @sjwang90

powersj • Sep 13 '22 18:09

@powersj the main reason for this is to simplify the usage of the client for Python beginners.

The import_data_set_sync_batching.py example works well, but from my POV something like the following code would be pretty useful:

# Proposed usage: a synchronous write that the client splits into batch_size-sized
# requests; csv_to_generator() yields points lazily as in import_data_set_sync_batching.py.
from influxdb_client import WriteOptions
from influxdb_client.client.write_api import WriteType
from influxdb_client.rest import ApiException

write_api = client.write_api(write_options=WriteOptions(write_type=WriteType.synchronous, batch_size=5_000))

try:
    write_api.write(bucket='my-bucket', record=csv_to_generator('vix-daily.csv'))
except ApiException as e:
    print(f"Something is wrong: {e}")
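
For contrast, a hypothetical sketch of what a user has to write today to get the same effect with the synchronous API: chunk the generator manually before writing. The chunked() helper is illustrative only, not part of the library:

from itertools import islice

from influxdb_client import InfluxDBClient
from influxdb_client.client.write_api import SYNCHRONOUS

def chunked(iterable, size):
    # Illustrative helper: yield lists of at most `size` items from `iterable`.
    iterator = iter(iterable)
    while chunk := list(islice(iterator, size)):
        yield chunk

with InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org") as client:
    write_api = client.write_api(write_options=SYNCHRONOUS)
    # csv_to_generator() is the lazy point generator referenced above.
    for chunk in chunked(csv_to_generator('vix-daily.csv'), 5_000):
        write_api.write(bucket='my-bucket', record=chunk)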

> I am inclined to close this as won't fix.

We can do it that way and refer users to import_data_set_sync_batching.py.

bednar • Sep 14 '22 11:09