Internet.nl
When to "grey out" subtests
It seems that we are currently not consistent in greying out subtests.
Example:
- in the e-mail test we grey out 'DMARC policy' if the 'DMARC existence' fails.
- in the e-mail test we grey out 'DANE validity' and 'DANE roll scheme'.
- in the e-mail test we don't grey out '0-RTT' if TLS 1.3 is not supported by the mail server. We just pass the test.
In an e-mail (26 September 2019, 13:58) GT states:
> I think we are being inconsistent by greying out the 0-RTT subtest when TLS 1.3 is not available.
> Greying out a subtest means we could not test it and you get penalized for it, not that it does not apply to your case and we ignore the subtest.
> For the 0-RTT case the test result should be positive, because we checked your server and it does not support 0-RTT, which is a positive result.
> The green or grey icon provides feedback on the result of the test, not on the availability of the feature.
> I think this is confusing for our users and we should consider changing / simplifying our approach.
Related: #378
This issue and #703 should be picked up together, as skipped test results are sometimes misinterpreted in reports that use Internet.nl for measurements.
Visually, tests skipped for 'positive' reasons (e.g. no mail server on a non-mailing domain name) should be distinguishable from tests skipped for 'negative' reasons (e.g. skipping the 'DMARC policy' test when DMARC does not exist).
Technically, the good and bad skips should also be distinguishable, for instance by using different labels as proposed in #703.
So in the API `zero_rtt` can be `good`, `bad`, or `na`, but `na` is currently just rendered as ✅ 0-RTT. It would help if these cases were more distinguishable.
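A minimal sketch of what distinguishable labels could look like for an API consumer. The status values and icons below are assumptions for illustration, not the actual Internet.nl API: the point is only that a "positive" skip and a "negative" skip carry different labels instead of both collapsing into ✅.

```python
# Hypothetical sketch: render subtest results so that skips for
# positive reasons look different from skips for negative reasons.
# Status names and icons are assumptions, not the real Internet.nl API.
from typing import Literal

Status = Literal["good", "bad", "not_applicable", "not_tested"]

ICONS: dict[str, str] = {
    "good": "✅",            # test ran and passed
    "bad": "❌",             # test ran and failed
    "not_applicable": "➖",  # skipped for a positive reason (e.g. no mail server)
    "not_tested": "⚪",      # skipped for a negative reason (prerequisite failed)
}


def render_subtest(name: str, status: Status) -> str:
    """Render one subtest line with an icon reflecting *why* it was skipped."""
    return f"{ICONS[status]} {name}"


print(render_subtest("0-RTT", "good"))               # server checked, 0-RTT off
print(render_subtest("DMARC policy", "not_tested"))  # DMARC existence failed
```

With four distinct labels, a report generator can count "passed" and "not applicable" separately instead of lumping both under a green check.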