node-firestore-import-export
Allocation failed - JavaScript heap out of memory
Expected behavior
The export completes successfully without out-of-memory errors.
This is being run on a 4 GB CI/CD instance; I'm not sure how much data we're keeping in the Firebase collections, as GCP doesn't show much detail.
Actual behavior
Starting Export 🏋️
Retrieving documents from collectionA
Retrieving documents from collectionB
Retrieving documents from collectionC
<--- Last few GCs --->
[59:0x55e45951f140] 320807 ms: Mark-sweep 1925.1 (1958.9) -> 1918.9 (1961.5) MB, 3710.1 / 0.0 ms (average mu = 0.109, current mu = 0.024) allocation failure scavenge might not succeed
[59:0x55e45951f140] 324495 ms: Mark-sweep 1927.2 (1977.5) -> 1920.9 (1978.7) MB, 3601.3 / 0.0 ms (average mu = 0.068, current mu = 0.024) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
/bin/bash: line 151: 59 Aborted (core dumped) firestore-export --accountCredentials $GOOGLE_APPLICATION_CREDENTIALS --backupFile export.json --prettyPrint
Steps to reproduce the behavior
Try to export one or more large collections.
Workaround
Increasing the Node memory limit can work around the issue, but ideally we shouldn't need to hold all the data in RAM:
export NODE_OPTIONS=--max_old_space_size=4096
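The same variable can also be set inline for a single run of the failing command shown above:

NODE_OPTIONS=--max_old_space_size=4096 firestore-export --accountCredentials $GOOGLE_APPLICATION_CREDENTIALS --backupFile export.json --prettyPrint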
More details
Running the export on my laptop (higher specs), I can see the resulting file is only 27 MB. It's surprising that it needs multiple GB of RAM to run on the CI instance.
This is solved by #346 (open PR)
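For anyone needing a stopgap until that PR lands, here is a minimal sketch of my own (not the approach taken in #346) that streams a single collection to an NDJSON file with firebase-admin, so only one document is held in memory at a time. It assumes firebase-admin is installed and GOOGLE_APPLICATION_CREDENTIALS points at a service account; unlike firestore-export it does not serialise Timestamps, GeoPoints or document references specially, and it does not recurse into subcollections.

const admin = require('firebase-admin');
const fs = require('fs');

// Uses the service account referenced by GOOGLE_APPLICATION_CREDENTIALS.
admin.initializeApp({ credential: admin.credential.applicationDefault() });

async function exportCollection(name) {
  const out = fs.createWriteStream(`${name}.ndjson`);
  // Query.stream() yields QueryDocumentSnapshots one at a time instead of
  // materialising the whole result set with get().
  for await (const doc of admin.firestore().collection(name).stream()) {
    // Backpressure handling is omitted for brevity.
    out.write(JSON.stringify({ id: doc.id, data: doc.data() }) + '\n');
  }
  out.end();
}

exportCollection('collectionA').catch((err) => {
  console.error(err);
  process.exit(1);
});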
I had a much larger Firestore DB to export, so increasing the Node heap limit wasn't an option.
I ended up using https://www.npmjs.com/package/firestore-backfire, which works without issues.
My database contains about 2 million (20 lakh) records. I got the same error after 2 hours. I am sure this operation wasted a good amount of money 😥 I will have to wait until the end of the month to see the bill from Google Cloud.
