node-mysql2
FATAL ERROR JavaScript heap out of memory
I have a query that returns more than 300 thousand records from a table. My code is in Dump.ts. After executing this query, the process runs out of memory. I tried to fix it with these flags:
--max_old_space_size=6000 --optimize_for_size --max_executable_size=6000 --stack_size=6000 --trace-gc
The full error is:
<--- Last few GCs --->
[7600:000000949173C4B0] 107414 ms: Mark-sweep 1415.4 (1500.7) -> 1415.4 (1500.7) MB, 2778.0 / 0.0 ms allocation failure scavenge might not succeed
[7600:000000949173C4B0] 110134 ms: Mark-sweep 1415.4 (1500.7) -> 1415.4 (1484.7) MB, 2719.0 / 0.0 ms last resort
[7600:000000949173C4B0] 112782 ms: Mark-sweep 1415.4 (1484.7) -> 1415.4 (1484.7) MB, 2646.0 / 0.0 ms last resort
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 000001A98441CEA9 <JSObject>
0: builtin exit frame: stringify(this=000001A98442E2F9 <JSON map = 000003E13070EC79>,000001A984402241 <undefined>,000001A984402241 <undefined>,000002E8C122AD49 <JSArray[1269410]>)
1: arguments adaptor frame: 1->3
2: normalizeObject [C:\Users\jul.mora\Documents\GitHub\Dump\dist\dump.js:~324] [pc=0000034FD8D4050C](this=0000030E7F424F49 <Dump map = 0000004357D8E121>,objMysql=000002E8...
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
error Command failed with exit code 3.
thanks
Does this happen if you run the same query in isolation? (i.e. the only thing the script does is connect to the DB, execute the query, and close the connection)
How much total memory do you have?
Hi @sidorares - it connects to the DB, executes the queries, and closes the connection correctly.
The specs of my PC:
- Windows 8.1
- Node v8.4.0
- Total RAM: 8 GB
I did the same in Python and it worked :'(
Thanks
@juliandavidmr The commit you linked to is doing a lot more than a simple connect, query, and fetch.
Could you try with a simpler setup (without streams and the other extra code you have)? It looks like there is a memory leak somewhere in your code and not an issue with mysql2.
You are loading those 300,000+ rows into memory before processing them, and doing that for multiple tables at the same time. Because you do it in a `rows.map(async () => ...)`, the VM does not wait for the previous dump to finish; it moves straight to the next iteration of the map for all of your rows. The generated promises have to keep references to the results of every single async callback until they all complete and the results are consumed, which prevents the GC from collecting anything.
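A minimal sketch of that anti-pattern (`normalizeObject` comes from your stack trace; `writeToDump` is a hypothetical stand-in for whatever per-row work you do):

```ts
// Every callback starts immediately, so the promises for all 300k+ rows
// are created up front, and each one keeps its row's results reachable
// until the whole batch settles -- nothing can be garbage-collected.
const pending = rows.map(async (row) => {
  const normalized = normalizeObject(row); // result retained by the closure
  await writeToDump(normalized);
});
```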
Replace your map with a for loop, preferably an indexed `for` loop, so that the next row isn't processed until you are done with the previous row's results, and see if that helps. Also, with this many results it is probably not wise to load them all into memory anyway; consider using streams (see the sketches below).
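A sequential version might look like this (a sketch, reusing the hypothetical helpers above, inside an async function):

```ts
// await makes each iteration finish fully before the loop advances,
// so only one row's data is live at a time.
for (let i = 0; i < rows.length; i++) {
  const normalized = normalizeObject(rows[i]);
  await writeToDump(normalized);
}
```

And for the streaming route, mysql2 keeps the row-event API of node-mysql, so you can consume rows one at a time without ever materializing the full result set (the connection options and table name here are placeholders):

```ts
import * as mysql from 'mysql2';

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  database: 'test',
});

connection
  .query('SELECT * FROM big_table')
  .on('result', (row) => {
    // handle one row at a time; rows are never buffered into one big array
  })
  .on('error', (err) => console.error(err))
  .on('end', () => connection.end());
```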
It is also generally not a good idea to use async functions as iteration callbacks, because the underlying implementation (e.g. `Array.prototype.map`) may differ in how it calls them. An async function returns a promise immediately when called without `await`, so if you have 10,000 elements and pass an async callback to `forEach` or `map`, you essentially start 10,000 functions in parallel, each keeping alive every piece of data it touches. You also get no reliable way to tell when they have all finished. If you want parallelism, use `Promise.all`.
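If parallelism is what you actually want, collect the promises and await them explicitly, e.g. (again inside an async function, with the same hypothetical helpers):

```ts
// Promise.all gives a reliable completion signal: it resolves once every
// callback has finished, and rejects on the first error.
await Promise.all(rows.map(async (row) => {
  await writeToDump(normalizeObject(row));
}));
```

Note that this still holds every row's results in memory at once, so for a 300k-row dump the sequential or streaming approach above remains the safer choice.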
Check this answer: StackOverflow