🐛 [firestore-bigquery-export] The option "--batch-size" does not work when I set "--multi-threaded" to true.
[READ] Step 1: Are you in the right place?
Issues filed here should be about bugs for a specific extension in this repository. If you have a general question, need help debugging, or fall into some other category, use one of these other channels:
- For general technical questions, post a question on Stack Overflow with the firebase tag.
- For general Firebase discussion, use the firebase-talk Google Group.
- To file a bug against the Firebase Extensions platform, or for an issue affecting multiple extensions, please reach out to Firebase support directly.
[REQUIRED] Step 2: Describe your configuration
- Extension name: @firebaseextensions/fs-bq-import-collection
- Extension version: @firebaseextensions/[email protected]
- Configuration values (redact info where appropriate):
- Omitted.
[REQUIRED] Step 3: Describe the problem
Steps to reproduce:
```
fs-bq-import-collection \
  --non-interactive \
  --project ${MY_PROJECT_NAME} \
  --source-collection-path users/{user_id}/bookmarks \
  --query-collection-group true \
  --dataset firestore_export \
  --table-name-prefix bookmarks \
  --batch-size 300 \
  --dataset-location us \
  --multi-threaded true \
  --use-new-snapshot-query-syntax true \
  --use-emulator false
```
Expected result
```
...
{"severity":"INFO","message":"Inserted 300 row(s) of data into BigQuery"}
...
```
Actual result
```
...
{"severity":"INFO","message":"Inserted 7584 row(s) of data into BigQuery"}
...
```
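For context, with --batch-size 300 the 7584 documents from this run should have been split into Math.ceil(7584 / 300) = 26 inserts of at most 300 rows each. A minimal sketch of that expected chunking (illustrative only; none of these names come from the extension's source):

```ts
// Split items into batches of at most `size` rows before inserting.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

const docs = new Array(7584).fill(null); // stand-in for the exported documents
const batches = chunk(docs, 300);
console.log(batches.length);                     // 26 batches
console.log(batches[batches.length - 1].length); // 84 rows in the final partial batch
```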
Hi,
This is now partially resolved in 0.1.23. It was actually the source of another bug as well: the script previously applied no limit at all to batch sizes.
In multi-threaded mode, --batch-size should now act as a maximum batch size. If the script is filtering using wildcards, some batches may not be full; a sketch of the intended behavior follows.
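As a rough sketch of those semantics (assuming hypothetical helpers; this is not the extension's actual implementation), each worker would cap its inserts at the configured batch size after filtering on the wildcard path:

```ts
interface FirestoreDoc {
  path: string; // parent collection path, e.g. "users/abc/bookmarks"
  data: Record<string, unknown>;
}

// Hypothetical helper: turn a pattern like "users/{user_id}/bookmarks" into
// a regex and test a document's collection path against it.
function matchesWildcardPath(path: string, pattern: string): boolean {
  const regex = new RegExp("^" + pattern.replace(/\{[^}]+\}/g, "[^/]+") + "$");
  return regex.test(path);
}

// Hypothetical helper: wraps the BigQuery insert for one capped batch.
async function insertRows(rows: FirestoreDoc[]): Promise<void> {
  console.log(`Inserted ${rows.length} row(s) of data into BigQuery`);
}

async function processSlice(
  docs: FirestoreDoc[],
  batchSize: number,
  sourcePath: string
): Promise<void> {
  // Wildcard filtering can drop documents from a worker's slice, which is
  // why some batches end up smaller than --batch-size.
  const matching = docs.filter((d) => matchesWildcardPath(d.path, sourcePath));

  // --batch-size acts as an upper bound: each insert carries at most
  // batchSize rows, even when this worker holds more documents.
  for (let i = 0; i < matching.length; i += batchSize) {
    await insertRows(matching.slice(i, i + batchSize));
  }
}
```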
I will leave this issue open, since it is not completely fixed; we will review and post updates here as soon as they are available.
Thanks for your patience!