
Allocation failed - JavaScript heap out of memory #815

Open

Lusitaniae opened this issue Jul 9, 2021 · 3 comments
Comments

@Lusitaniae commented Jul 9, 2021

Expected behavior

Running the export should complete successfully without out-of-memory errors.

This is being run from a 4 GB CI/CD instance; I'm not sure how much data we're keeping in the Firebase collections, as GCP doesn't show much.

Actual behavior

Starting Export 🏋️
Retrieving documents from collectionA
Retrieving documents from collectionB
Retrieving documents from collectionC
<--- Last few GCs --->
[59:0x55e45951f140]   320807 ms: Mark-sweep 1925.1 (1958.9) -> 1918.9 (1961.5) MB, 3710.1 / 0.0 ms  (average mu = 0.109, current mu = 0.024) allocation failure scavenge might not succeed
[59:0x55e45951f140]   324495 ms: Mark-sweep 1927.2 (1977.5) -> 1920.9 (1978.7) MB, 3601.3 / 0.0 ms  (average mu = 0.068, current mu = 0.024) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
/bin/bash: line 151:    59 Aborted                 (core dumped) firestore-export --accountCredentials $GOOGLE_APPLICATION_CREDENTIALS --backupFile export.json --prettyPrint

Steps to reproduce the behavior

Try to export one or more large collections.

Workaround

Increasing the Node memory limit can work around the issue, but ideally we wouldn't need to hold all the data in RAM:

export NODE_OPTIONS=--max_old_space_size=4096

More details

Running the export on my laptop (higher specs), I can see the resulting file is only 27 MB. It's surprising that it needs multiple GB of RAM to run in CI.
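
If the exporter builds the entire backup object in memory before serializing it (especially with --prettyPrint), that would explain why a 27 MB file needs a multi-GB heap. Below is a minimal sketch of a streaming alternative using the firebase-admin SDK, not the firestore-export implementation; the function name exportCollection, the page size of 500, and the JSON Lines output format are illustrative assumptions, and the output is not compatible with firestore-import.

// streaming-export.ts — a sketch, not the firestore-export implementation.
// Assumes firebase-admin is installed and GOOGLE_APPLICATION_CREDENTIALS is set.
import * as admin from 'firebase-admin';
import { createWriteStream } from 'fs';

admin.initializeApp(); // picks up GOOGLE_APPLICATION_CREDENTIALS

// Page through one collection and append each document to a JSON Lines file,
// so only `pageSize` documents are ever held in memory at once.
async function exportCollection(name: string, outPath: string, pageSize = 500): Promise<void> {
  const db = admin.firestore();
  const out = createWriteStream(outPath);
  let last: admin.firestore.QueryDocumentSnapshot | undefined;

  for (;;) {
    let query = db
      .collection(name)
      .orderBy(admin.firestore.FieldPath.documentId())
      .limit(pageSize);
    if (last) query = query.startAfter(last);

    const page = await query.get();
    if (page.empty) break;

    for (const doc of page.docs) {
      // One JSON object per line; nothing accumulates beyond the current page.
      out.write(JSON.stringify({ id: doc.id, data: doc.data() }) + '\n');
    }
    last = page.docs[page.docs.length - 1];
  }

  out.end();
}

exportCollection('collectionA', 'collectionA.jsonl').catch((err) => {
  console.error(err);
  process.exit(1);
});

Paginating by document ID keeps each query bounded, and writing one line per document means memory use stays roughly proportional to the page size rather than to the collection size.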

@Lusitaniae (Author)

This is solved by #346 (open PR)

@nabilfreeman

I had an insanely large Firestore DB to export, so increasing Node's memory limit wasn't an option.

I ended up using https://www.npmjs.com/package/firestore-backfire, which works without bugs.

@adarshmadrecha

My database contains about 20 lakh (2 million) records. I got the error below after 2 hours.
I'm sure this operation will have wasted a good amount of money 😥 I'll have to wait till the end of the month to see the bill from Google Cloud.

[screenshot of the error]
