JavaScript heap out of memory error when exporting large schemas
When using `exportSchema` with a large database schema (in my case, 294 tables), the process fails with a JavaScript heap out of memory error. This issue occurs specifically with larger schemas, while smaller schemas work as expected.
Current workaround: increasing the Node.js heap size limit with `--max-old-space-size=8192` resolves the issue.
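For reference, the failing call is essentially just `exportSchema` with the Node flag applied at the command line. A minimal sketch (schema construction is elided, and the file paths are placeholders):

```ts
// export-schema.mts — run with: node --max-old-space-size=8192 export-schema.mjs
// (equivalently: NODE_OPTIONS=--max-old-space-size=8192)
import { exportSchema } from "graphile-export";
import { schema } from "./schema.js"; // placeholder: however you build your GraphQLSchema

// Writing the executable schema to disk is the step that exhausts the
// default V8 heap limit on large schemas.
await exportSchema(schema, `${process.cwd()}/exported-schema.mjs`, {
  mode: "typeDefs", // I believe this is one of graphile-export's output modes
});
```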
This issue suggests that the schema export process might need optimization for handling large schemas more efficiently.
My instincts tell me this is an issue in this function: https://github.com/graphile/crystal/blob/dc0d44696d929f2ffcb8725e5bcf9951db937cb1/utils/graphile-export/src/optimize/index.ts#L41
If you replace that function with:
```ts
export const optimize = (inAst: t.File, runs = 1): t.File => {
  return inAst;
};
```
does the issue go away?
It seems your instincts were correct: the issue does indeed go away with that change.
I think this also has a very big impact on export times!
I was trying to export a very large schema as well: 306 tables, 180 views, 90 functions. With optimization turned on, it had already been running for 50 minutes, and I had also allowed it to use more than 16 GB of memory.
Also worth noting that aggregations are on for all relations by default:
```ts
extends: [
  PostGraphileAmberPreset,
  ConnectionFilterPlugin.PostGraphileConnectionFilterPreset,
  PgAggregatesPreset,
],
gather: {
  pgIdentifiers: "unqualified",
},
```
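For context, the full config looks roughly like this (the import paths are my best guess at the published package names, so double-check them against your own setup):

```ts
// graphile.config.ts — hypothetical full config around the fragment above
import "graphile-config";
import { PostGraphileAmberPreset } from "postgraphile/presets/amber";
import * as ConnectionFilterPlugin from "postgraphile-plugin-connection-filter";
import { PgAggregatesPreset } from "@graphile/pg-aggregates";

const preset: GraphileConfig.Preset = {
  extends: [
    PostGraphileAmberPreset,
    ConnectionFilterPlugin.PostGraphileConnectionFilterPreset,
    PgAggregatesPreset,
  ],
  gather: {
    pgIdentifiers: "unqualified",
  },
};

export default preset;
```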
After seeing this very helpful post, I tried disabling optimization and the export completed in under a minute! Noice 😄
FYI: the final schema without optimization is 52 MB. Not sure whether that counts as very big.
EDIT:
I was actually not using the preset with the aggregation plugin. After I enabled it, the export ran for more than 10 minutes even with optimization disabled, and then it ran out of memory with 16 GB. I think I should probably just opt in to aggregations more explicitly.
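Something like the following is what I have in mind (a sketch only: `defaultBehavior` is a real V5 schema option, but the `aggregates` behavior string is my assumption, so check the pg-aggregates docs for the actual names):

```ts
import "graphile-config";
import { PostGraphileAmberPreset } from "postgraphile/presets/amber";
import { PgAggregatesPreset } from "@graphile/pg-aggregates";

const preset: GraphileConfig.Preset = {
  extends: [PostGraphileAmberPreset, PgAggregatesPreset],
  schema: {
    // Turn aggregates off globally...
    defaultBehavior: "-aggregates", // assumption: behavior string name
  },
  // ...then opt individual tables back in with a smart comment, e.g.:
  //   COMMENT ON TABLE app_public.orders IS E'@behavior +aggregates';
};

export default preset;
```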