pg2mysql
Got a 7GB pgsql dump
Hi there,
I have a 7.7 GB pgsql dump file that needs to be converted into a MySQL dump.
However, my PHP memory limit is 512 MB and my system has 4 GB of RAM, with currently about 700 MB free.
How can the parsing be split so that pg2mysql divides the MySQL dump into several parts?
COPY cn (id, did, fid, l, m, ff, g, by, db, id_rc, id_rd, ac, ad, an, sa, dn, misc) FROM stdin;
... multiple
... lines
... of
... loooong
... data
... here
\.
I haven’t looked at this in a very long time. This was based on some other abandoned code.
There are a few strategies that come to mind for you. I haven’t explored them, they’re just off the top of my head. Roughly ordered from easiest to most difficult.
- see if you can get a machine with more resources to do the conversion, maybe connecting from your laptop?
- see if you can break apart the dump file by insert statement for the various tables (and maybe later split a single table into multiple inserts with a regex)
- bring up the MySQL DB and write a script in your app that copies a row from PG and writes it to MySQL
- update this code here to work as a stream or pipe rather than loading everything into memory (see the sketch below)
I realize the above aren’t super helpful. I mostly inherited this code and the community has pushed it forward here and there.
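For the last option, here is a minimal sketch of the streaming idea. It assumes a single line-based pass over the dump; convert_block() is a hypothetical stand-in for the conversion logic, not pg2mysql's actual API:

<?php
// Stream the PostgreSQL dump one block at a time so peak memory stays
// proportional to the largest single block, not to the whole file.
function convert_block($lines) {
    // Hypothetical placeholder: the real PG-to-MySQL translation would go here.
    return implode("", $lines);
}

$in  = fopen("dump.pgsql", "r");
$out = fopen("dump.mysql", "w");
$block = array();
while (($line = fgets($in)) !== false) {
    $block[] = $line;
    $trimmed = rtrim($line);
    // A COPY data section ends with "\." on its own line; other statements end
    // with ";". (A real implementation would also track whether it is inside a
    // COPY block, since data rows can themselves end in ";".)
    if ($trimmed === "\\." || substr($trimmed, -1) === ";") {
        fwrite($out, convert_block($block)); // convert and flush this block only
        $block = array();                    // drop the buffer before reading on
    }
}
fclose($in);
fclose($out);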
appears related to https://github.com/ChrisLundquist/pg2mysql/issues/2
Even with 16 GB of RAM. On line 24 in pg2mysql.inc.php I saw
ini_set("memory_limit", "512M");
which I tried to increase to ini_set("memory_limit", "7G").
Finally the script stops at line 419, where the resulting MySQL code is concatenated into a very long string to be returned later. This causes the error:
PHP Fatal error: Allowed memory size of 4294967296 bytes exhausted (tried to allocate 241600600 bytes) in /Volumes/DATA0/nise/Documents/proj_002_aple/FernUniDaten/moodle-exporter/pg2mysql/pg2mysql.inc.php on line 419
See also https://www.airpair.com/php/fatal-error-allowed-memory-size
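That particular failure could be avoided by writing each converted statement out as soon as it is produced instead of concatenating everything into one returned string. A minimal sketch, assuming the converter is handed an output handle (names here are illustrative, not the current pg2mysql API):

<?php
// Emit converted statements incrementally so memory use no longer grows
// with the size of the dump.
function emit_converted($statements, $outfp) {
    foreach ($statements as $stmt) {
        fwrite($outfp, $stmt . "\n"); // flush each statement to disk immediately
    }
    // Nothing is returned, so no multi-gigabyte string is ever built.
}

$outfp = fopen("dump.mysql", "w");
emit_converted(array(
    "INSERT INTO cn (id) VALUES (1);",
    "INSERT INTO cn (id) VALUES (2);",
), $outfp);
fclose($outfp);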
I have 16 GB of RAM, but I've been stuck at this screen for 30 minutes and am still waiting.
To convert a big file, change pg2mysql.inc.php (lines marked + are additions):

+ $COPY = ''; $COPY_str = 0; $COPY_maxrec = 500;
  while ($instr = fgets($infp)) {
      ...
+     if (strpos($instr, "COPY") !== false) { $COPY = $instr; $COPY_str = 0; }
      $pgsqlchunk[] = $instr;
+     $COPY_str++;
      ...
      if ($progress == 1.0
          || (strlen($instr) > 3 && ($instr[$len - 3] == ")" && $instr[$len - 2] == ";" && $instr[$len - 1] == "\n") && $inquotes == false)
+         || $COPY_str == $COPY_maxrec
      ) {
+         if ($COPY_str == $COPY_maxrec)
+             $pgsqlchunk[] = "\.\n";
          $chunkcount++;
          ...
          $pgsqlchunk = array();
+         if ($COPY_str == $COPY_maxrec) {
+             $pgsqlchunk[] = $COPY;
+             $COPY_str = 0;
+         }
      ...
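The idea: remember the current COPY header line, count the data rows that follow it, and once $COPY_maxrec rows have accumulated, terminate the block with \. so the normal chunk flush runs, then seed the next chunk by re-emitting the saved COPY header and resetting the counter. Each 500-row piece then converts and writes out independently, so per-chunk memory stays bounded regardless of table size.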