json-schema-vocabularies
Support maxDepth for object types
If a field is specified as an object, it'd be great to have a maxDepth constraint that ensures the object does not exceed n levels of nesting. Example:
{
  "type": "object",
  "maxDepth": 2
}
Passes:
{
  "key": {
    "sub_key": "value"
  }
}
Fails:
{
  "key": {
    "sub_key": {
      "sub_sub_key": "value"
    }
  }
}
@binarylogic so this is a case where you are not specifying any particular properties, just a nesting level?
BTW if you want a workaround right now it would be:
{
  "type": "object",
  "additionalProperties": {
    "additionalProperties": {
      "additionalProperties": false
    }
  }
}
obviously that is substantially less readable! :-P
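For an arbitrary limit, the same pattern can be generated programmatically. Here is a minimal TypeScript sketch (buildMaxDepthSchema is a hypothetical helper, not part of any library); it relies on the fact that additionalProperties is ignored for non-object instances, so plain values pass at any level:

// Hypothetical helper: builds a schema equivalent to "maxDepth": n
// by nesting "additionalProperties" n levels deep.
function buildMaxDepthSchema(maxDepth: number): object {
  // Innermost level: any further object nesting is rejected.
  let schema: object = { additionalProperties: false };
  // Wrap one additionalProperties layer per allowed depth level.
  for (let i = 1; i < maxDepth; i++) {
    schema = { additionalProperties: schema };
  }
  return { type: "object", additionalProperties: schema };
}

// buildMaxDepthSchema(2) reproduces the workaround schema above.
console.log(JSON.stringify(buildMaxDepthSchema(2), null, 2));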
Ah, that is a clever solution. I'll use that for now. Thanks!
Yes, this is a case where we're accepting arbitrary objects but we want to impose vertical (depth) and horizontal (number of properties) limitations.
Interesting... I'll add it to ajv-keywords ;) I guess there should be a minDepth too in this case...
maxDepth and minDepth are tricky because they require tracking the current location in the instance relative to the first match of the subschema on the containing object. We don't currently have validation assertion keywords that require that kind of state management.
So this could be done either by performing a special traversal that checks depth when the keyword is encountered, or by "remembering" that it is in effect and tracking depth while processing the child instances. Both seem like a paradigm shift in validation implementation.
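For illustration, the first approach is essentially a depth measurement of the instance where the keyword applies. A minimal sketch, assuming depth is counted as in the examples above (measureDepth is a hypothetical name):

// Hypothetical helper: a non-object contributes no nesting depth;
// an object is one level deeper than its deepest property value.
function measureDepth(value: unknown): number {
  if (typeof value !== "object" || value === null || Array.isArray(value)) {
    return 0;
  }
  let deepest = 0;
  for (const child of Object.values(value)) {
    deepest = Math.max(deepest, measureDepth(child));
  }
  return 1 + deepest;
}

// A maxDepth assertion would then be a single comparison:
const valid = measureDepth({ key: { sub_key: "value" } }) <= 2; // true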
I don't know if I like this.
But I think there's a sensible way implementations could handle this. You just need to decrement the effective value of the "maxDepth" keyword for each nested object; when it reaches 0 and another object is encountered, validation fails.
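A minimal sketch of that decrementing scheme in TypeScript (validateMaxDepth is an illustrative name, not a real implementation):

function validateMaxDepth(instance: unknown, remaining: number): boolean {
  // Non-objects consume no depth budget.
  if (typeof instance !== "object" || instance === null || Array.isArray(instance)) {
    return true;
  }
  // An object encountered with no budget left fails validation.
  if (remaining === 0) {
    return false;
  }
  // Each nested object sees an effective maxDepth one lower.
  return Object.values(instance).every((child) => validateMaxDepth(child, remaining - 1));
}

// Mirrors the example schema: passes at two levels, fails at three.
validateMaxDepth({ key: { sub_key: "value" } }, 2);              // true
validateMaxDepth({ key: { sub_key: { sub_sub_key: "v" } } }, 2); // false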
Moving keyword proposals without clear immediate support into the vocabularies repository.
In this case, I'm not sure this keyword really fits the overall processing model, but that can be discussed in the other repo.
> So this could be done either by performing a special traversal that checks depth when the keyword is encountered, or by "remembering" that it is in effect and tracking depth while processing the child instances. Both seem like a paradigm shift in validation implementation.
This isn't so different from enum and const, which need to do deep traversals of objects and arrays to perform an equality check.
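For comparison, a minimal sketch of the kind of deep traversal that const/enum equality already requires (simplified; a real implementation would handle more edge cases):

// Simplified deep equality: comparing nested structures means
// recursing through them, much like a depth check would.
function deepEqual(a: unknown, b: unknown): boolean {
  if (a === b) return true;
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null) {
    return false;
  }
  // Walk properties (array indices included) recursively.
  const aKeys = Object.keys(a);
  const bKeys = Object.keys(b);
  if (aKeys.length !== bKeys.length) return false;
  return aKeys.every((k) =>
    deepEqual((a as Record<string, unknown>)[k], (b as Record<string, unknown>)[k])
  );
}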
I really like the proposal of a maxDepth keyword (and I'd add minDepth just for symmetry, as it could also be useful sometimes) and would make use of it personally.
IMO it should also apply to array instances.