Add support for data values schema without defaults
Describe the problem/challenge you have
Some data values don't have sensible defaults and I'd like template users to always provide them
Describe the solution you'd like
Maybe a special value for the @schema/default annotation, which would indicate that there is no default value at all? The schema is then used to define just the type of the value, expecting the template user to always provide it.
Anything else you would like to add:
Currently it can be simulated with assert, but it is too verbose.
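For reference, a sketch of the assert-based workaround using the `assert.fail` idiom from ytt's `@ytt:assert` module (the data value name `password` and the message are illustrative, not from a real project):

```yaml
#! template.yml: fail at render time if the user kept the empty default
#@ load("@ytt:data", "data")
#@ load("@ytt:assert", "assert")
---
secret: #@ data.values.password or assert.fail("data value 'password' must be provided")
```

Repeating this for every required value is the verbosity being complained about.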
Vote on this request
This is an invitation to the community to vote on issues, to help us prioritize our backlog. Use the "smiley face" up to the right of this comment to vote.
👍 "I would like to see this addressed as soon as possible"
👎 "There are other more important things to focus on right now"
We are also happy to receive and review Pull Requests if you want to help working on this issue.
Hey @redbaron 👋🏻
Yes! This is a use-case we're looking to tackle in an upcoming track of work: Schema Validations. (epic: #561) We're literally in the middle of pulling together the scope of that track, but here's the relevant part of the proposal feeding into that work: https://hackmd.io/pODV3wzbT56MbQTxbQOOKQ?view#schemavalidate
We're still working out the details, but this case could be handled like so:
```yaml
#@schema/nullable
#@schema/validate not_null=True
connector: ""
```
That is:
- acceptable values are strings;
- the default value is `null` (`@schema/nullable` both allows `null` and sets the default value to `null`);
- after all data values have been set, this value cannot be nullable.
What do you think?
Great stuff! Why there is a #@schema/nullable annotation, though?
> after all data values have been set, this value cannot be nullable.

cannot be null
> Great stuff! Why there is a #@schema/nullable annotation, though?
because from a type system perspective there are 2 valid types: string OR null (vs other fields that may only support type string -- or some other type). however, we do not want to allow values to be null, hence the validation. this option, however, is not something we would recommend for most uses... (jump to next section)
> Some data values don't have sensible defaults and I'd like template users to always provide them
I'd like to push back on this assumption -- can you provide some concrete examples?
example i can think of is asking users for an aws secret key. the type of the key is string (it can only be string; it can never be null; in Go this would be `type AWSSecretKey string`, not a pointer to string). the default is empty since there is no usable default value; however, as an author you definitely want to validate this string to be non-empty. (validation today can be done directly via assert.fail or eventually via schema validations).
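Both routes for that example, sketched (the schema syntax below is taken from the proposal linked earlier in the thread and may change before release). Today, the template-side check via assert:

```yaml
#! today: template-side check via assert
#@ load("@ytt:data", "data")
#@ load("@ytt:assert", "assert")
---
key: #@ data.values.aws_secret_key or assert.fail("aws_secret_key must be non-empty")
```

Eventually, declared in the schema itself:

```yaml
#! eventually: declared in the schema (proposed syntax, subject to change)
#@data/values-schema
---
#@schema/validate min_len=1
aws_secret_key: ""
```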
> can you provide some concrete examples.
App name, passwords, references to other resources, API endpoints of dependencies, etc.
> because from a type system perspective there are 2 valid types: string OR null (vs other fields that may only support type string -- or some other type). however, we do not want to allow values to be null, hence the validation.
It seems that you do it in validation because the type system is not expressive enough. Allowing null at the type level, just to reject it in validation later, is a bit confusing, but maybe it is a good compromise.
Maybe extending the type system can help to simplify this corner case (not sure it is worth it, just throwing an idea):
```go
type StringValue struct {
	IsNull bool
	Value  string
}
```
then a data value internally becomes the following:
```go
var storesValue = &StringValue{Value: "xyz"}
var storesNull = &StringValue{IsNull: true}
var storesNothing *StringValue = nil
```
allowing you to distinguish between 3 possible states:
- variable was assigned a value
- variable was assigned null
- variable wasn't assigned anything
> type system is not expressive enough.
that's exactly my point though: in a reasonably powerful type system, the above is `myType = string | null` as a combined type -- which is exactly what nullable provides as syntactic sugar (type can be null or x). but... my argument is that you do not need that; what you really want is `string & len() > 0`, which is what i'm advocating for, and to represent that you specify `type=string, validate=non_empty`.
> my argument is that you do not need that, what you really want is `string & len() > 0`
OK got it, you are saying that it is possible to bypass the null state and just check the value directly. Generally speaking, what I want is to require the user to provide a variable's value. We can approximate it by checking that the final value, after all overlays are computed, is not equal to the default; this won't catch the case when the user provides a value equal to the default, but I can't think of a case where that would become a problem.
I can see it working for strings, but what about numbers? `validate not_eq=-1`, where -1 is an "impossible" value for this particular variable (some other variables might use 0 for that)?
How would this work for non-scalar variables? Let's say I expect the user to provide a struct (map with known keys); do I resort to nullable & validate=not_null? I'd be leaning towards doing the latter for all types, even scalars, simply for consistency reasons.
Also, what error message is the user going to see? If it comes from validation, it'll say that the value doesn't pass validation, which is not as good as "please provide this value", especially as the app evolves and users apply a new version using an old values file.
Overall, I can see how validation can be used to approximate required variables, and it is great to have this feature; however, having a dedicated @schema/required annotation would be better IMHO, because:
- variables can be marked as required in exactly the same way, regardless of their type
- better error messages when the user needs to take some action
- it is easier to document and explain
If you keep the current type system the same, a required annotation can internally be implemented as a combination of "nullable & validate=not_null".
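To illustrate that desugaring, a hypothetical sketch (`@schema/required` is this thread's suggestion, not an implemented annotation; the key names are illustrative):

```yaml
#@data/values-schema
---
#! hypothetical sugar suggested in this thread:
#@schema/required
password: ""

#! ...which would desugar internally to the existing combination:
#@schema/nullable
#@schema/validate not_null=True
api_token: ""
```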
I'm getting, @redbaron, that the key point here is having a quality UX... both for those who are reading the schema, itself; and for those who get the error messages from running with such a schema. 👍🏻
I'm thinking we can get there from here with some thoughtful choices:
As @cppforlife pointed out above, authors will want to specify constraints anyway. If we can craft error messages around zero-values that read well for both cases, that might be a good balance (given that adding both the specific constraint/validation and @schema/required would make for a rather verbose schema).
Example:
```yaml
#@data/values-schema
---
appName: ""

#@schema/nullable
#@schema/validate not_null=True
db_conn:
  host: ""
  #@schema/validate min=1025 max=65535
  port: 0
  username: ""
  password: ""
  db_name: ""
```
where:

- `appName`
  - by default, strings will have the validation `@schema/validate min_len=1`;
  - the `min_len` validation could provide a special-case error message for when `len(val) == 0`:
    - "Specify a value for 'appName', it is empty."
    - "A value for 'appName' is required; it is empty."
- `db_conn`
  - I can't think of a validation that means "empty" other than to set it to `null` (and then assert it can't be `null`).
  - the `not_null` validation could likewise yield an error message that reads very close to "required":
    - "A value for 'db_conn' is required; it is null."
    - "'db_conn' is null, but a non-null value is required."

More generically:
- for things for which a length can be taken (strings, arrays), when `len = 0` and `min_len >= 1` (which will be the default for strings), the error message can use the adjective "empty"... and perhaps even use the word "required"
- for all other cases (e.g. including integer values), we'd — at a minimum — encourage the use of `@schema/nullable` + `@schema/validate not_null=True`. Given how common that can be, we can provide (as you suggested, @redbaron) some syntax sugar:
  - `@schema/unset`
  - `@schema/undefined`
  - `@schema/not-nullable`
  - ... ?
I am facing the same situation again; started to google just to find my own issue :)
Another idea might be to allow the type to be set by an annotation:
```yaml
#@schema/type string
password: null
```
here we declare the type, but provide an impossible value (impossible because there is no @schema/nullable), forcing the user to set it in the passed data values.
Thank you for the bump, @redbaron.
Great news: release v0.43.0 is imminent — in it, the Data Values Validation feature will become Generally Available.
Your use case — wanting to require the end-user to supply a Data Value — is the top use case. Here are the relevant docs in draft.
We're in the final stages; can't commit to a timeline, but we're talking days, not weeks.
If you wanted to kick the tires right now, much of this functionality is in experimental mode in v0.42.0. Check out our blog post that announced the preview.
EDIT: There was a typo in the annotation name (validate → validation). Now it works. Do we need to validate annotations as well? :-D
Discard the rest.
I was also looking into that and think it is worth reporting that it doesn't seem to work for me.
```yaml
#! base.yaml
#@data/values-schema
---
helm:
  #@schema/nullable
  namespace: ""
environment:
  #@schema/nullable
  #@schema/validate not_null=True
  id: ""
```
```yaml
#! env.yaml
#@data/values
---
helm:
  namespace: default
```
```shell
❯ YTTEXPERIMENTS=validations ytt version
ytt version 0.42.0
- experiment "validations" enabled.
❯ YTTEXPERIMENTS=validations ytt -f base.yaml -f env.yaml --data-values-inspect
helm:
  namespace: default
environment:
  id: null
```
I'd expect an error, because environment.id is annotated with @schema/validate not_null=True.
Thanks for the report @Zebradil.
> EDIT: There was a typo in the annotation name (validate → validation). Now it works. Do we need to validate annotations as well? :-D
Yeah, we've had (short) conversations about this in the past: #114
There's some ambiguity left in that conversation; worth circling back to see if we can't have a more consistent+cohesive approach (and better user experience).