fix: add FST_ERR_CTP_INVALID_JSON_BODY
While working on fastify-compress I realized that our default content type parser for JSON does not return Fastify errors.
See the code in https://github.com/fastify/fastify-compress/pull/342
This also avoids a potential reflection attack.
How about avoiding the creation of stack traces, like this?
function defaultJsonParser (req, body, done) {
  const stackTraceLimit = Error.stackTraceLimit
  Error.stackTraceLimit = 0
  try {
    if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
      return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
    }
    const json = parse(body, parserOptions)
    done(null, json)
  } catch {
    return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
  } finally {
    Error.stackTraceLimit = stackTraceLimit
  }
}
@mcollina @metcoder95 Just for completeness: this can slow down Fastify's response time on invalid JSON payloads by about 90%! I would actually recommend considering my proposal above about avoiding the stack traces.
What's the performance difference with the stack trace limitation?
Benchmark case:
'use strict'

const fastify = require('../../fastify')({
  logger: false
})

fastify
  .post('/', function (req, reply) {
    reply
      .send({ hello: req.body.hello })
  })

fastify.listen({ port: 3000 }, (err, address) => {
  if (err) throw err
})
Sending valid JSON:
autocannon -m POST -H content-type=application/json -b '{"hello":"world"}' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections
┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms │ 1 ms │ 0.02 ms │ 0.21 ms │ 30 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Req/Sec │ 14,583 │ 14,583 │ 23,055 │ 23,535 │ 22,300 │ 2,465.92 │ 14,583 │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 2.74 MB │ 2.74 MB │ 4.34 MB │ 4.42 MB │ 4.19 MB │ 463 kB │ 2.74 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 11
245k requests in 11.01s, 46.1 MB read
Sending invalid JSON on the main branch:
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections
┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms │ 1 ms │ 0.09 ms │ 0.34 ms │ 18 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Req/Sec │ 11,007 │ 11,007 │ 17,039 │ 17,279 │ 16,420 │ 1,730.68 │ 11,002 │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 3.48 MB │ 3.48 MB │ 5.39 MB │ 5.46 MB │ 5.19 MB │ 547 kB │ 3.48 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 180634 non 2xx responses
181k requests in 11.01s, 57.1 MB read
This branch:
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections
┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms │ 1 ms │ 0.18 ms │ 0.47 ms │ 30 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬─────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Req/Sec │ 8,111 │ 8,111 │ 13,423 │ 14,031 │ 12,879.82 │ 1,588.2 │ 8,110 │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Bytes/Sec │ 2.77 MB │ 2.77 MB │ 4.58 MB │ 4.78 MB │ 4.39 MB │ 541 kB │ 2.77 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴─────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 141664 non 2xx responses
142k requests in 11.01s, 48.3 MB read
With my suggested change:
function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }

  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    const stackTraceLimit = Error.stackTraceLimit
    Error.stackTraceLimit = 0
    try {
      if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
        return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
      }
      const json = parse(body, parseOptions)
      done(null, json)
    } catch {
      return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
    } finally {
      Error.stackTraceLimit = stackTraceLimit
    }
  }
}
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections
┌─────────┬──────┬──────┬───────┬──────┬────────┬─────────┬───────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼──────┼──────┼───────┼──────┼────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms │ 1 ms │ 0.1 ms │ 0.34 ms │ 20 ms │
└─────────┴──────┴──────┴───────┴──────┴────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬──────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Req/Sec │ 10,951 │ 10,951 │ 17,039 │ 17,375 │ 16,289.1 │ 1,737.52 │ 10,944 │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Bytes/Sec │ 3.73 MB │ 3.73 MB │ 5.81 MB │ 5.92 MB │ 5.55 MB │ 592 kB │ 3.73 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴──────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 179183 non 2xx responses
179k requests in 11.01s, 61.1 MB read
Alternative:
const stackTraceLimit = Error.stackTraceLimit
Error.stackTraceLimit = 0
const defaultInvalidJSONError = new FST_ERR_CTP_INVALID_JSON_BODY()
Error.stackTraceLimit = stackTraceLimit

function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }

  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
      return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
    }
    let json
    try {
      json = parse(body, parseOptions)
    } catch (err) {
      err.code = 'FST_ERR_CTP_INVALID_JSON_BODY'
      err.statusCode = 400
      err.message = "Body is not valid JSON but content-type is set to 'application/json'"
      Object.setPrototypeOf(err, defaultInvalidJSONError)
      return done(err, undefined)
    }
    done(null, json)
  }
}
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections
┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms │ 1 ms │ 0.11 ms │ 0.36 ms │ 20 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬─────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Req/Sec │ 9,639 │ 9,639 │ 16,607 │ 17,055 │ 16,014.91 │ 2,031.5 │ 9,633 │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Bytes/Sec │ 3.28 MB │ 3.28 MB │ 5.66 MB │ 5.82 MB │ 5.46 MB │ 694 kB │ 3.28 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴─────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 176182 non 2xx responses
176k requests in 11.01s, 60.1 MB read
Yeah, don't worry.
I'm ok with the suggestion; secure-json-parse seems to already remove the stack trace upon parsing, so the regression seems to come mostly from error generation.
I'd say this is possibly not a hot path per se, so the regression from your stack trace suggestion should be fine.
I am wondering whether it would be better to do this in @fastify/error and provide an option to hide the stack trace:

// per-error
createError(code, message [, statusCode [, Base]] [, hideStackTrace])

// globally
const createError = require('@fastify/error')
createError.hideStackTrace = true

The global option would hide the stack trace for all Fastify errors in production. The per-error option would hide or unhide the stack trace where we think it necessary.
The global option is needed because we always provide the full stack trace by default. It would be better to hide that detail in production mode.
LGTM!
How should we move on?
If @climba03003's suggestion has the same benefits, I'd say his approach would be a good way to move forward.
@Uzlopak would you like to rebase and finish this up?
I think we can just set captureStackTrace to false for all validation errors?
@gurgunday
Sorry, but is this now acceptable? It has been such a long time since I touched this.
Oh, ok. Should be ready now ;)