Error if chunk size > 128kb
Hello,
If the chunk size exceeds the maximum string length of V8, then the following error will occur:
Error: toString failed
at Buffer.toString (buffer.js:377:11)
This line will fail:
var matches = /for table `(.*?)`/.exec(chunk.toString('utf8'));
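One possible workaround (a sketch, not mysqldumpsplit's actual fix) is to search the raw Buffer with `Buffer.prototype.indexOf` and decode only the small slice containing the table name, so no string ever approaches V8's length limit. The `extractTableName` helper below is illustrative, not part of the project:

```javascript
// Hypothetical workaround: match against the raw Buffer instead of
// decoding the whole chunk with toString('utf8').
function extractTableName(chunk) {
  var marker = Buffer.from('for table `');
  var start = chunk.indexOf(marker); // Buffer-to-Buffer search, no decoding
  if (start === -1) return null;
  start += marker.length;
  var end = chunk.indexOf(0x60, start); // 0x60 = closing backtick
  if (end === -1) return null;
  // Decode only the few bytes holding the table name.
  return chunk.slice(start, end).toString('utf8');
}
```

For example, `extractTableName(Buffer.from('-- for table `users` --'))` returns `'users'` while never converting the full chunk to a string.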
Are you running on windows?
Yes, windows 10
That's the reason. Are you able to share a minimal SQL file that has this problem?
On 23 May 2017 at 09:52, "Konstantin Krassmann" [email protected] wrote:
Yes, windows 10
— You are receiving this because you commented.
Reply to this email directly, view it on GitHub https://github.com/vekexasia/mysqldumpsplit/issues/2#issuecomment-303318429, or mute the thread https://github.com/notifications/unsubscribe-auth/AAMPS75QTiwboZt7m5dOn5SpKuvPurOjks5r8pA2gaJpZM4KTj3R .
Also, doing the math in my head, I'm pretty sure that couldn't be the line causing the problem.
Are you willing to share the full stack trace?
Well, I don't have the files to reproduce the error anymore.
Just take a table with > 1 GB of data with INSERT statements (~15 million rows).