BOSL2
Review docs for correctness
Need to verify that the docs are correct and match the code.
This issue has been inactive for more than four years and has received little attention. Still, I believe documentation is important and deserves care.
Reviewing all of the documentation is probably a team effort, as I don't think anybody except perhaps the main maintainers knows every aspect of the library.
The obvious approach is to provide a table listing all documented functions and let reviewers add their name to the corresponding cell once they have reviewed a function, along with the hash of the master branch at which the review occurred.
However, because the documentation is very well structured, I'd also like to propose a complementary approach: digging through the documentation programmatically. Since Python is already present in the codebase, it seems the natural language of choice. I prepared a sketch of this idea at https://github.com/shepard8/BOSL2/tree/90-doc-checks.
Basically, executing `python3 scripts/doc_crawler/test_documentation.py` produces the following output (cut for brevity):
```
(...)
./bottlecaps.scad : Module sp_cap:
  OK: Description is present
  OK: Description is not empty
  OK: At most one description
  OK: Synopsis is present
./bottlecaps.scad : Function sp_diameter:
  OK: Description is present
  OK: Description is not empty
  OK: At most one description
  OK: Synopsis is present

Statistics by file
(...)
Summary for ./bottlecaps.scad: 120 successes, 0 failures.

Statistics by check
ID         Severity        Success  Failure  Message
ConCode.1  needed               17        0  Constant is defined after documentation
ConCode.2  needed               16        1  Correct constant is defined (`NAME = `)
Desc.1     needed              793        0  Description is present
Desc.2     needed              793        0  Description is not empty
Desc.3     needed              779       14  At most one description
Syn.1      commonality         778       15  Synopsis is present
Syn.2     needed               793        0  Synopsis is not empty
Syn.3      needed              780       13  At most one synopsis
Syn.4      needed              793        0  Synopsis is single line
Usage.1    recommendation      755       21  Usage is present
Usage.2    needed              776        0  Usage is not empty
Usage.3    needed              751       25  At most one usage block

Statistics by severity
Severity.needed         : 53
Severity.recommendation : 21
Severity.commonality    : 15

Summary: 7824 successes, 89 failures.
```
The general idea is to provide a list of checks that are executed on each documented constant, function, module, and module&function. Each check has a severity (`needed`, `recommendation`, or `commonality`).
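To make the idea concrete, here is a minimal sketch of how a check could be represented. All names here are hypothetical and do not necessarily match the branch linked above; the doc-comment format assumed is the BOSL2 `// Section:`-style comment.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, List

class Severity(Enum):
    NEEDED = "needed"
    RECOMMENDATION = "recommendation"
    COMMONALITY = "commonality"

@dataclass
class Check:
    check_id: str       # e.g. "Desc.1"
    severity: Severity
    message: str        # e.g. "Description is present"
    run: Callable[[List[str]], bool]  # doc-comment lines -> pass/fail

# Hypothetical implementation of Desc.1: the doc comment contains a
# "Description:" section somewhere among its lines.
desc_present = Check(
    check_id="Desc.1",
    severity=Severity.NEEDED,
    message="Description is present",
    run=lambda lines: any(
        l.lstrip("/ ").startswith("Description:") for l in lines
    ),
)

print(desc_present.run(["// Module: sp_cap()", "// Description: A cap."]))
print(desc_present.run(["// Module: sp_cap()"]))
```

A flat list of such `Check` objects keeps the crawler itself trivial: it only has to extract doc-comment blocks and run every check over each block.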
Additionally, the run can be parameterized with `--severity S`, `--check ID`, `--file F`, and `--hide_successes`, with, I hope, self-explanatory effects.
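For illustration, the filtering flags could be wired up with `argparse` roughly as follows; the flag names come from the list above, while the help texts and defaults are my guesses:

```python
import argparse

def parse_args(argv=None):
    # Flags mirror those described in this issue; the implementation
    # details are a sketch, not the actual script's code.
    p = argparse.ArgumentParser(description="Run BOSL2 documentation checks")
    p.add_argument("--severity", help="only run checks of this severity")
    p.add_argument("--check", help="only run the check with this ID (e.g. Desc.1)")
    p.add_argument("--file", help="only scan this .scad file")
    p.add_argument("--hide_successes", action="store_true",
                   help="report failures only")
    return p.parse_args(argv)

args = parse_args(["--severity", "needed", "--hide_successes"])
print(args.severity, args.hide_successes)
```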
Importantly, this will make it possible to check that
- the arguments are in tune with the function/module definition,
- the examples compile,
- the references (see also, topics, etc.) exist,
- the definition comes just after the documentation.
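The first of these points can be sketched as follows: compare the parameter names extracted from the `module`/`function` definition line with the names listed under the doc comment's `Arguments:` section. The helper names and the exact parsing rules are hypothetical, and a real implementation would need to handle multi-line signatures and untyped argument descriptions:

```python
import re

def signature_args(defline):
    # Parameter names from a definition like 'module sp_cap(h, d=10) {'.
    params = re.search(r"\((.*)\)", defline).group(1)
    return [p.split("=")[0].strip() for p in params.split(",") if p.strip()]

def documented_args(doc_lines):
    # Names from '//   name = description' lines following '// Arguments:'.
    names, in_args = [], False
    for line in doc_lines:
        body = line.lstrip("/").strip()
        if body.startswith("Arguments:"):
            in_args = True
        elif in_args:
            m = re.match(r"(\w+)\s*=", body)
            if m:
                names.append(m.group(1))
            else:
                in_args = False  # left the Arguments: section
    return names

doc = ["// Arguments:", "//   h = Height of the cap.", "//   d = Diameter."]
print(documented_args(doc) == signature_args("module sp_cap(h, d=10) {"))
```

The same extraction machinery would also serve the reference checks: once names are parsed out of the doc comments, verifying that every `See Also` target exists is a set-membership test.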
Should I continue this effort, or do you think it makes little sense for any reason?
Best regards,