Large entries generate invalid chunked INSERT
Environment
- Operating System: Darwin
- Node Version: v22.17.1
- Nuxt Version: 4.2.0
- CLI Version: 3.30.0
- Nitro Version: 2.12.9
- Package Manager: [email protected]
- Builder: -
- User Config: compatibilityDate, devtools, modules, vite
- Runtime Modules: @nuxt/[email protected]
- Build Modules: -
Version
v3
Reproduction
https://stackblitz.com/~/github.com/NathanDubord/content-v3-test
Description
Hi, I'm trying to migrate a few thousand blog posts on about.gitlab.com from Content v2 to v3. While migrating I came across a blog post that misbehaves.
I can't find a reason this particular file fails. I've tried removing parts of the blog post, but the results are inconsistent: deleting a given section sometimes makes the error go away and sometimes doesn't. Any idea what's going on?
Additional context
- Same issue with better-sqlite3.
- Created a 500 kB YAML file to test whether file size alone is the trigger, but saw no issue.
- Removed all instances of `{` from the troubled file, but still saw the issue.
Logs
ERROR [unhandledRejection] SQLITE_ERROR: unrecognized token: "{" 1:15:17 PM
at sqliteError (node_modules/sqlite3/lib/sqlite3.pure.js:1:94706)
at 51656 (node_modules/sqlite3/lib/sqlite3.pure.js:1:12557)
at _emscripten_asm_const_int (node_modules/sqlite3/lib/sqlite3.pure.js:1:84040)
at ccall (node_modules/sqlite3/lib/sqlite3.pure.js:1:5494)
at Object.eval (node_modules/sqlite3/lib/sqlite3.pure.js:1:5844)
at eval (node_modules/sqlite3/lib/pure/database.js:86:26)
at processTicksAndRejections (node:internal/process/task_queues:196:998)
at Object.callback (https://contentv3test-001g.w-credentialless-staticblitz.com/blitz.cf284e50.js:31:296069)
WARN Error: SQLITE_ERROR: unrecognized token: "{" 1:15:17 PM
[1:15:17 PM] ERROR [uncaughtException] abort(Error: SQLITE_ERROR: unrecognized token: "{"). Build with -s ASSERTIONS=1 for more info.
at _0xf8839a.abort (node_modules/sqlite3/lib/sqlite3.pure.js#cjs:1:10598)
at _0xf8839a.emit (https://contentv3test-001g.w-credentialless-staticblitz.com/builtins.97a3df4f.js:30:11150)
at emitUnhandledRejection (https://contentv3test-001g.w-credentialless-staticblitz.com/builtins.97a3df4f.js:193:3439)
at throwUnhandledRejectionsMode (https://contentv3test-001g.w-credentialless-staticblitz.com/builtins.97a3df4f.js:193:4262)
at processPromiseRejections (https://contentv3test-001g.w-credentialless-staticblitz.com/builtins.97a3df4f.js:193:5015)
at processTicksAndRejections (https://contentv3test-001g.w-credentialless-staticblitz.com/builtins.97a3df4f.js:196:1218)
at https://contentv3test-001g.w-credentialless-staticblitz.com/blitz.cf284e50.js:31:545396
at https://contentv3test-001g.w-credentialless-staticblitz.com/blitz.cf284e50.js:31:29480
ERROR [unhandledRejection] SQLITE_ERROR: no such function: CONCAT 1:15:17 PM
at sqliteError (node_modules/sqlite3/lib/sqlite3.pure.js:1:94706)
at 51656 (node_modules/sqlite3/lib/sqlite3.pure.js:1:12557)
at _emscripten_asm_const_int (node_modules/sqlite3/lib/sqlite3.pure.js:1:84040)
at ccall (node_modules/sqlite3/lib/sqlite3.pure.js:1:5494)
at Object.eval (node_modules/sqlite3/lib/sqlite3.pure.js:1:5844)
at eval (node_modules/sqlite3/lib/pure/database.js:86:26)
at processTicksAndRejections (node:internal/process/task_queues:196:998)
at _0x339a53 (https://contentv3test-001g.w-credentialless-staticblitz.com/blitz.cf284e50.js:31:538074)
WARN Error: SQLITE_ERROR: no such function: CONCAT
I asked Codex to help me debug this, and here is what it came back with:
- The SQLite failure came from the `_content_blog` INSERT: the chunking logic in @nuxt/content was adding an extra empty string literal, so the SQL looked like `... "title":"Hello"}'', '{"title":...`, and SQLite stopped at that stray `{` (a reduced example follows this list).
- Added a guard in `node_modules/@nuxt/content/dist/module.mjs` (line 2386) and `node_modules/@nuxt/content/dist/module.cjs` (line 2402) so the chunking branch is skipped when the largest value is already shorter than the 70 kB slice size, leaving the INSERT unchanged (≈101 kB) and syntactically valid. A rough sketch of the guard appears below.
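To make the failure concrete, here is a hypothetical reduction of the statement shape described above; the table name comes from the logs, but the values are invented:

```ts
// Hypothetical reduction of the malformed INSERT -- the values are
// invented, only the shape matters. SQLite escapes a quote inside a
// string literal by doubling it, so the stray '' does not close the
// first literal: the literal runs on through the ", " separator, the
// next quote finally closes it, and the tokenizer then hits a bare
// { -- matching the `unrecognized token: "{"` error in the logs.
const malformed =
  `INSERT INTO _content_blog VALUES ('{"title":"Hello"}'', '{"title":"World"}')`

// The same statement without the extra empty literal is valid:
const valid =
  `INSERT INTO _content_blog VALUES ('{"title":"Hello"}', '{"title":"World"}')`
```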
This is way outside my level of expertise, but it did fix the issue.
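For anyone who wants the gist without digging through the dist bundle, here is a minimal sketch of the kind of guard described above. All names (`chunkValue`, `SLICE_SIZE`) are mine for illustration, not actual @nuxt/content internals:

```ts
// Illustrative sketch only -- chunkValue and SLICE_SIZE are invented
// names, not the real @nuxt/content code.
const SLICE_SIZE = 70_000 // the ~70 kB slice size mentioned above

function chunkValue(value: string): string[] {
  // The guard: a value that already fits in a single slice bypasses
  // the chunking branch entirely, since that branch is what emitted
  // the extra empty string literal into the generated INSERT.
  if (value.length <= SLICE_SIZE) return [value]

  // Otherwise split the value into SLICE_SIZE-sized pieces.
  const chunks: string[] = []
  for (let i = 0; i < value.length; i += SLICE_SIZE) {
    chunks.push(value.slice(i, i + SLICE_SIZE))
  }
  return chunks
}
```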