formidable
trimmed v3 benchmarks
Used @mjackson's benchmark suite; it seems like Formidable v3 is performing pretty well, especially with large files.
Link: https://github.com/mjackson/remix-the-web/pull/66, where you can find the trimmed-down formidable v3 (using the parser class directly and a tweaked multipart plugin that doesn't use `this`).
nice
There's a certain feeling that after so many years you're probably still the best parser lol.
Though it's surprising that it holds up for large files, since it's a byte-by-byte streaming parser. That should hurt perf, but somehow it doesn't.
Actively playing with different stuff. mjackson's is hard to beat, though it gives far less control at the core level. People can use his and build limits on top, but it's even better to have some of the limits baked in at the parser level. That's a bit questionable, because in the streaming process you have to move through the part's body anyway; still, if you don't even emit/yield that part's body to the upper level, you save some overhead there. Otherwise, in the upper-level `for await...of` loop that receives the part objects, the user (or a higher-level wrapper) can just validate and `continue` (rough sketch below).
tldr: so-so. 😅 A good extensive test suite and good extensive benchmarks are needed. I'll push a new branch when I'm ready with something more final and stable.
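A rough sketch of that consumer-level skipping, assuming a parser that exposes parts as an async iterable (the shape @mjackson/multipart-parser exposes); `AnyPart`, `handleParts`, and the part fields used here are illustrative placeholders, not a real API:

// Illustrative only: works with any parser that yields part objects as an
// async iterable; the isFile()/type shape on the parts is assumed.
type AnyPart = { isFile(): boolean; type: string };

async function handleParts(parts: AsyncIterable<AnyPart>) {
  const allowedTypes = new Set(['image/png', 'image/jpeg']);

  for await (const part of parts) {
    // Validate at the consumer level and just skip unwanted parts.
    // The parser still has to walk the part's bytes either way, which is why
    // baking limits into the parser itself can save some of that overhead.
    if (part.isFile() && !allowedTypes.has(part.type)) continue;

    // ...handle the accepted part here
  }
}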
v4 published on the `next` npm dist-tag
Docs and basic examples at: https://www.npmjs.com/package/formidable/v/4.0.0-rc.3
It's a fork of @mjackson/multipart-parser, because A) we can't build on top of it directly for now, and B) it's unbeatable. The limit options have to live inside the parser so it can skip calling the handler for a part that is above the limits.
API
import {
  parseMultipartRequest,
  parseMultipart,
  formidableDefaultOptions,
  FormidableError,
  type FormidableOptions,
  type FormidablePart,
} from 'formidable';
// For node.js v14 and v16 compat
// import { parseMultipartRequest } from 'formidable/node14';
// import { parseMultipartRequest } from 'formidable/node16';
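// Not from the docs: a sketch of the `options` object used below. It assumes
// formidableDefaultOptions is a plain options object you can spread; the exact
// limit option names are not confirmed here (the ERR_MAX_* codes below suggest
// size limits exist), so check the v4 RC docs before relying on specific keys.
const options: FormidableOptions = {
  ...formidableDefaultOptions,
  // maxFileSize: 10 * 1024 * 1024, // hypothetical key, shown only as an example override
};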
try {
  await parseMultipartRequest(req /* Request */, options, async (part: FormidablePart) => {
    // part.type - part content type, media type (no charsets)
    // ==== Headers ====
    // part.headers - part headers, parsed into type-safe SuperHeaders object
    // part.headers.contentType.mediaType - part content media type
    // part.headers.contentType.charset - part content charset
    // part.headers.contentDisposition.name
    // part.headers.contentDisposition.filename
    // part.headers.contentDisposition.preferredFilename
    // part.headers.contentDisposition.type - attachment or inline or form-data
    // ==== Streaming ====
    // await part.stream() - no buffering, async iterable, useful to use in `for await (const chunk of part.stream())`
    // ==== Buffering ====
    // await part.text(failSafe?) - buffer into string, pass failSafe = true to avoid crashing
    // await part.bytes() - buffer into Uint8Array bytes
    // await part.arrayBuffer() - buffer into ArrayBuffer
    // await part.json() - buffer into JSON object, pass failSafe = true to avoid crashing
    // ==== Utils ====
    // part.toString() - string representing the state of properties (name, filename, type, headers)
    // part.toObject() - the core of `toString()`, returns an object with the properties (name, filename, type, headers)
    // part.isFile() - check if the part is a file
    if (part.isFile()) {
      console.log('file', part.name, part.filename, part.toString());
    } else {
      // part.text() on field gets the input's value
      console.log('field', part.name, await part.text());
    }
  });
} catch (er) {
  // TypeScript doesn't allow a concrete type annotation on the catch variable,
  // so narrow/cast it to FormidableError before reading `code`.
  const error = er as FormidableError;

  switch (error.code) {
    case 'ERR_INVALID_INPUT':
    case 'ERR_BODY_CONSUMED':
    case 'ERR_FAILED_TO_PARSE_TEXT':
    case 'ERR_FAILED_TO_PARSE_JSON':
    case 'ERR_NO_BOUNDARY':
    case 'ERR_MAX_FILENAME_SIZE':
    case 'ERR_MAX_FILE_SIZE':
    case 'ERR_MAX_FILE_KEY_SIZE':
    case 'ERR_MAX_FIELD_SIZE':
    case 'ERR_MAX_FIELD_KEY_SIZE':
    case 'ERR_MAX_HEADER_SIZE':
    case 'ERR_MAX_HEADER_KEY_SIZE':
    case 'ERR_MAX_HEADER_VALUE_SIZE':
    case 'ERR_MAX_ALL_HEADERS_SIZE':
      // Handle each code separately if needed; they are grouped here only
      // because this example treats them all the same way.
      console.error(error.message);
      break;
  }
}
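For file parts, here is a minimal sketch (not from the docs) of consuming `part.stream()` without buffering and writing it to disk, assuming the chunks are `Uint8Array` as described above and that the destination path is yours to choose:

import { open } from 'node:fs/promises';
import type { FormidablePart } from 'formidable';

// Stream a single file part to disk chunk by chunk, without buffering it in memory.
async function saveFilePart(part: FormidablePart, destination: string) {
  const file = await open(destination, 'w');
  try {
    for await (const chunk of part.stream()) {
      await file.write(chunk);
    }
  } finally {
    await file.close();
  }
}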
Any chance you could include Formidable@1 in those benchmarks? ;) I'd be curious how performance stacks up and/or changes between v1, v2, and finally v3 :)
@TheThing not particularly interested in that. It's bad. It's a lot of bloat, it's been deprecated for years, it has security flaws, and it's slow for sure. I've seen benchmarks against Busboy and it's slower. Thinner and newer stuff is faster, no way around that.