fastify-multipart
limits.fileSize not working properly with different files
Prerequisites
- [X] I have written a descriptive issue title
- [X] I have searched existing issues to ensure the bug has not already been reported
Fastify version
4.x.x
Plugin version
8.3.0
Node.js version
LTS
Operating system
Linux
Operating system version (i.e. 20.04, 11.3, 10)
Debian 12
Description
I get very different behavior with different files. So far I have only tested with 2 image files: one is a JPG of ~2.3 MB, the other an upscaled PNG of ~22 MB. I am using the following code:
const fastify = require('fastify')();

fastify.register(require('fastify-multipart'));

fastify.post('/upload', async function (req, res) {
  let file = await req.file({ limits: { fileSize: 1024 * 1024 }, throwFileSizeLimit: true });
  if (file.file.truncated) {
    console.log('File too large');
    res.send('File too large');
    return res;
  }
  res.send('OK');
});

fastify.listen(3000, (err) => {
  if (err) throw err;
  console.log(`server listening on ${fastify.server.address().port}`);
});
The problem is: with one file the size limit works fine, while with the other it doesn't work at all. If I read the docs right, limits are in bytes, so (1024 * 1024) should cap uploads at 1 MB. Here are the two responses I get:
<ref *1> {
type: 'file',
fieldname: 'files[]',
filename: 'def_upscayl_4x_ultrasharp.png',
encoding: '7bit',
mimetype: 'image/png',
file: FileStream {
_events: {
close: undefined,
error: undefined,
data: undefined,
end: [Function (anonymous)],
readable: undefined,
limit: [Function (anonymous)]
},
_readableState: ReadableState {
highWaterMark: 16384,
buffer: [Array],
bufferIndex: 0,
length: 64840,
pipes: [],
awaitDrainWriters: null,
[Symbol(kState)]: 1052940
},
_maxListeners: undefined,
bytesRead: 64840,
truncated: false,
_eventsCount: 2,
_read: [Function (anonymous)],
[Symbol(shapeMode)]: true,
[Symbol(kCapture)]: false
},
fields: { 'files[]': [Circular *1] },
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
}
<ref *1> {
type: 'file',
fieldname: 'files[]',
filename: 'SSL27917.JPG',
encoding: '7bit',
mimetype: 'image/jpeg',
file: FileStream {
_events: {
close: undefined,
error: undefined,
data: undefined,
end: [Function (anonymous)],
readable: undefined,
limit: [Function (anonymous)]
},
_readableState: ReadableState {
highWaterMark: 16384,
buffer: [Array],
bufferIndex: 0,
length: 64851,
pipes: [],
awaitDrainWriters: null,
[Symbol(kState)]: 1052940
},
_maxListeners: undefined,
bytesRead: 64851,
truncated: false,
_eventsCount: 2,
_read: [Function (anonymous)],
[Symbol(shapeMode)]: true,
[Symbol(kCapture)]: false
},
fields: { 'files[]': [Circular *1] },
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
}
But when I change the limit to just 1024, which would be 1 KB, everything works fine. So what could be the problem?
Here are the files I'm uploading, so you can verify they have the stated file sizes:
Link to code that reproduces the bug
No response
Expected Behavior
No response
I'm a bit lost. You are reading only one file in your example. Can you add a complete example with scripts to reproduce?
MAX SIZE
I'm not sure it's the exact same issue, but the file size limit does not appear to be respected at all. Or maybe I'm doing something wrong?
// generated by postman
curl --location 'https://tr.localhost/api/file/upload' \
--form 'metadata="{}";type=application/json' \
--form '=@"/Users/a/Desktop/trashBlob-lg.txt"'
➜ Desktop stat -f%z trashBlob-lg.txt
59419146
fastify.register(async (fastify) => {
  await fastify.register(multipart, {
    limits: {
      fileSize: MAX_FILE_UPLOAD_SIZE,
      files: 1, // Maximum number of files per request
      fieldNameSize: 100, // Max field name size in bytes
      fieldSize: Bytes.addKiB(100).toBytes(), // 100 KiB max field value size
      fields: 1, // Max number of non-file fields
      headerPairs: 2000, // Max number of header key-value pairs
    },
  });

  fastify.post<{
    Reply: IFile;
  }>('/api/file/upload', {
    preHandler: async (request) => {
      const fileData = await request.file({
        limits: {
          fileSize: MAX_FILE_UPLOAD_SIZE, // 20MB
        },
        throwFileSizeLimit: true,
      });
      if (!fileData) throw new Error('No file uploaded');

      const fields = Object.values(fileData?.fields ?? {}) as Multipart[] & { type: 'field'; fieldname: string };
      let metadata: Record<string, unknown> | null = null;
      for (const field of fields) {
        if (field.type !== 'field') continue;
        if (field.fieldname === 'metadata') {
          if (field.mimetype !== 'application/json') throw new Error('Metadata must be a JSON object');
          if (!isPlainObject(field.value)) throw new Error('Metadata must be a plain object.');
          metadata = field.value as Record<string, unknown>;
          break;
        }
      }
      if (!metadata) throw new Error('Metadata is required. When forming a request, ensure the metadata field is listed before the file field.');

      const mimeType = fileData.mimetype;
      if (!isSupportedMimeType(mimeType)) throw new Error('Unsupported mime type');

      request.body = {
        file: fileData.file,
        mimeType,
        metadata,
        filename: fileData.filename,
      };
    },
    handler: async (request) => {
      const { principal, log } = request;
      const body = request.body as { file: Readable; mimeType: string; metadata: Record<string, unknown>; filename: string };
      const { file, mimeType, metadata, filename } = body;
      return await transact(async ({ repositories }: UnitOfWork) => {
        const fileService = new FileService({ repositories });
        const result = await fileService.saveNewFile({
          fileContent: file,
          mimeType,
          filename,
          metadata,
          ownerId: principal!.id,
          creatorId: principal!.id,
          creatorType: 'user',
          createdByAction: FileService.CREATE_ACTIONS.userUpload,
          logger: log as ILogger,
        });
        return result;
      });
    },
  });
});
This is accepted without any problem... shouldn't it throw as soon as the max content length limit is breached?
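As far as I can tell (an assumption, not verified against the plugin internals), the fileSize limit only fires while the file stream is actually being consumed, so the request is not rejected the moment the raw body crosses the limit. A cheap early guard I've been experimenting with is rejecting from the Content-Length header before any parsing starts; `exceedsLimit` and the hook wiring below are my own hypothetical helpers, reusing `MAX_FILE_UPLOAD_SIZE` from the snippet above:

```javascript
// Hypothetical early guard: reject oversized uploads from the
// Content-Length header before multipart parsing begins.
// Note: Content-Length covers the whole multipart body (boundaries,
// part headers, all parts), so this is an upper bound, not an exact
// file size.
function exceedsLimit(headers, maxBytes) {
  const len = Number(headers['content-length']);
  // Chunked requests carry no Content-Length; let stream limits handle those.
  return Number.isFinite(len) && len > maxBytes;
}

// Sketch of wiring it up as a hook:
// fastify.addHook('onRequest', async (request, reply) => {
//   if (exceedsLimit(request.headers, MAX_FILE_UPLOAD_SIZE)) {
//     return reply.code(413).send({ error: 'Payload too large' });
//   }
// });
```

With the 59419146-byte file above and a 20 MB limit, this guard would reject the request before the body is read at all.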
MAX NUMBER OF FILES?
Also, the max number of files doesn't seem to be respected: I've attached the file twice, but my request goes through and succeeds.
curl --location 'https://tr.localhost/api/file/upload' \
--form 'metadata="{}";type=application/json' \
--form '=@"/Users/a/Desktop/trashBlob-sm.txt"' \
--form '=@"/Users/a/Desktop/trashBlob-sm.txt"'
Is it because I'm not explicitly reading the second file? I don't want to accept the request at all if it is malformed. Is there any way to perform this validation without buffering the files completely?
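For what it's worth, one way I would sketch this (untested against the plugin; `collectWithFileCap` and the part shape are my own assumptions, loosely modelled on what `req.parts()` yields) is to iterate every part, count the files, and drain rather than buffer each stream:

```javascript
// Sketch (my own helper, not part of @fastify/multipart): enforce a file
// cap while draining, rather than buffering, every file stream.
// `parts` is any async iterable of { type, ... } objects; file parts are
// assumed to carry a Readable under `part.file`.
async function collectWithFileCap(parts, maxFiles) {
  const fields = [];
  let fileCount = 0;
  for await (const part of parts) {
    if (part.type === 'file') {
      fileCount += 1;
      if (fileCount > maxFiles) {
        part.file.resume(); // discard the remaining bytes, don't buffer them
        throw new Error('Too many files');
      }
      // ...pipe part.file to storage here; this sketch just drains it...
      part.file.resume();
    } else {
      fields.push(part);
    }
  }
  return { fileCount, fields };
}
```

The extra file is still received over the wire, but it is never held in memory, and the request can be failed as soon as the cap is exceeded.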
Might be related, but neither files() nor saveRequestFiles() supports per-call limit overrides higher than the global ones, because the limits are checked in the preValidation hook. So take care: as far as I can tell, limits can only be lowered per request, never raised. This part of the documentation is incorrect:
preValidation only honors the global settings. It might be a good idea to fix the documentation, since the current wording is counter-intuitive.
And if https://github.com/fastify/fastify-multipart/pull/580 lands, things might change.