Inconsistent behaviour of "arrayLimit"
From readme:
qs will also limit specifying indices in an array to a maximum index of 20.
Any array members with an index of greater than 20
will instead be converted to an object with the index as the key
The value 20 is configurable via the arrayLimit option.
First of all, arrayLimit is a very confusing name for the option. I would expect the resulting array to contain no more than arrayLimit items, with the remaining items rejected. However, that is not what happens: all values are parsed, but instead of an array I get an object.
OK, but what is the actual problem with arrayLimit? It is only honoured for arrays defined using the bracket form:
ids[]=43750&ids[]=70936&ids[]=94691&...
If I have more than 21 ids in a query string, the resulting value will be parsed as an object:
{
'0': '43750',
...
}
However, if the array is defined without brackets, arrayLimit is not honoured:
ids=43750&ids=70936&ids=94691&...
If I have, for example, 30 ids in the query string, the resulting value is still parsed as an array:
['43750', '70936', '94691', ...]
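The divergence between the two notations can be illustrated with a deliberately simplified sketch (this is hypothetical code, not qs's actual implementation): only the bracket form carries an index that can be compared against arrayLimit, while the bracketless form just appends values, so the limit is never consulted.

```javascript
// Hypothetical simplified parser (NOT qs's real code) showing why the
// two notations diverge under an arrayLimit-style check.
function parseSketch(query, arrayLimit = 20) {
  const result = {};
  for (const pair of query.split('&')) {
    const [rawKey, value] = pair.split('=');
    const bracket = rawKey.match(/^(.+)\[(\d*)\]$/);
    if (bracket) {
      const key = bracket[1];
      const current = result[key] || [];
      // Empty brackets mean "next index"; otherwise use the explicit one.
      const index = bracket[2] !== ''
        ? Number(bracket[2])
        : Object.keys(current).length;
      if (index > arrayLimit) {
        // Index exceeds the limit: demote the array to a plain object.
        result[key] = Object.assign({}, current, { [index]: value });
      } else {
        current[index] = value;
        result[key] = current;
      }
    } else {
      // Bracketless repeated key: there is no index to check,
      // so arrayLimit never comes into play.
      result[rawKey] = rawKey in result
        ? [].concat(result[rawKey], value)
        : value;
    }
  }
  return result;
}

console.log(parseSketch('ids=1&ids=2&ids=3'));
// → { ids: [ '1', '2', '3' ] }  (stays an array no matter how many ids)

console.log(parseSketch('a[0]=x&a[21]=y'));
// → { a: { '0': 'x', '21': 'y' } }  (index 21 > 20 demotes to an object)
```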
If there are no brackets, it's not necessarily an array - the server is free to treat it as "last one wins", "first one wins", or "an array". If it's important that it be an array, you should always use the brackets.
I'm not sure what the request is here - the parameter name is what it is. Are you suggesting a rename?
By default qs treats ids=43750&ids=70936&ids=94691 as an array. That's why I think qs should also apply arrayLimit to the bracketless array notation. This is the inconsistency mentioned in the title.
I created the issue because I could not understand why my two express applications behaved differently when they received an array in the query string (qs is the default parsing module in express). One returned an array, but the other returned an object with numeric keys. The problem turned out to be the way the arrays were written in the query string (with vs. without brackets).
Regarding the option name (arrayLimit): it actually means something completely different. I read that it is meant as protection against DoS, but it does not protect against DoS, because it does not limit the input data - it just parses the array as an object (why is that needed?).
Ah, thanks for clarifying.
The reason it’s needed is because you can do a[0]=1&a[99999999999999999999999]=2 and create a very large array. This problem doesn’t occur without brackets.
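The cost can be demonstrated in plain JavaScript, with no qs involved: a single assignment at a huge index makes the array's length huge, and any hole-visiting iteration then does work proportional to that length rather than to the handful of real values.

```javascript
// One far-out assignment creates a sparse array with an enormous length.
const sparse = [];
sparse[0] = 1;
sparse[999999] = 2;

console.log(sparse.length);              // 1000000
console.log(Object.keys(sparse).length); // 2 (only the real entries)

// A plain for-loop walks every hole:
let visited = 0;
for (let i = 0; i < sparse.length; i++) visited += 1;
console.log(visited);                    // 1000000 iterations for 2 values
```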
With a[0]=1&a[99999999999999999999999]=2 it makes sense, yes 👍
Would you want to submit a PR improving the docs to make it more clear why the option is needed?
@ljharb I've submitted the PR. I intentionally stated that the problem is only with iteration over the array (and that iteration happens inside qs itself). There are no other issues.
I mean, the same issue would occur with anyone iterating over the array with a method that didn’t skip holes, so it’s not just internal to qs.
From documentation:
When creating arrays with specific indices, qs will compact a sparse array to only the existing values preserving their order:
var noSparse = qs.parse('a[1]=b&a[15]=c');
assert.deepEqual(noSparse, { a: ['b', 'c'] });
It says that qs compacts the array and removes holes. So the user always receives an already "sanitised" array, and all iterations over it on the client side are fast.
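A hole-skipping compaction like the one documented above can be sketched in a few lines of plain JavaScript (a hypothetical illustration, not qs's actual code), since Object.keys enumerates only the assigned indices of a sparse array:

```javascript
// Hypothetical compaction sketch: Object.keys skips holes, so the
// output contains just the real values, in index order.
function compact(arr) {
  const out = [];
  for (const key of Object.keys(arr)) {
    out.push(arr[key]);
  }
  return out;
}

const sparse = [];
sparse[1] = 'b';
sparse[15] = 'c';
console.log(compact(sparse)); // [ 'b', 'c' ]
```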
ah, fair enough.
Honestly, I think arrayLimit could be removed in the near future. Every array qs handles could be wrapped in Object.values or Object.entries where needed. These methods skip uninitialized (empty) slots, so the result contains only the real values. The only problem is that they are natively supported only from Node 7 onward.
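For reference, the observable hole-skipping behaviour of these methods looks like this:

```javascript
// Object.values / Object.entries enumerate only a sparse array's
// assigned slots, so holes never appear in the result.
const sparse = [];
sparse[1] = 'b';
sparse[15] = 'c';

console.log(Object.values(sparse));  // [ 'b', 'c' ]
console.log(Object.entries(sparse)); // [ [ '1', 'b' ], [ '15', 'c' ] ]
```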
we’d add a dependency on the object.values or object.entries package in that case; or use a for..in loop.