Adjust requirements for the OpenSSF Badge at the Adopted Stage
Based on the discussion during the 2023-11-29 TAC Meeting (agenda item #502), this addresses the two changes discussed:
- Change from requiring achievement of the OpenSSF Best Practices badge at the Gold level to requiring that a large portion of the badge be completed. I put in 75% for now, but I expect the group to refine that number.
- I proposed that the OpenSSF Best Practices badge at the Gold level should be completed by the next Annual Review after the project reaches the Adopted Stage; again, the TAC can determine if one year is enough.
Separately, I will work on a table that outlines how to complete the OpenSSF Best Practices badge requirements.
Comment away, everyone :-)
As a point of discussion, here's a spreadsheet which aggregates the current badging state for ASWF projects:
https://docs.google.com/spreadsheets/d/1n8xEdbJ77fVk5YxtuqjC7KZywi0W7ZfXlGf0YjVZI9Q/edit#gid=1274673236
- No project has yet achieved Silver, with MaterialX and OpenEXR being the closest
- OpenEXR, OpenColorIO, MaterialX, OpenVDB, OpenImageIO, OpenCue, OSL, and OpenAssetIO have achieved Passing; OpenTimelineIO is pretty close to achieving Passing
- Some requirements are present at more than one badge level; for instance, a requirement might be a "SHOULD" for Passing but a "MUST" for Silver. The API only returns a single result for such a requirement and doesn't indicate which badge it applies to. There are only a few such "cross badge" requirements (4 or 5), so that shouldn't skew the results too much.
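For anyone who wants to regenerate or extend that spreadsheet, here is a minimal sketch of pulling per-level completion percentages from the BadgeApp JSON API. The endpoint shape and the badge_percentage_0/1/2 field names are based on the public API at bestpractices.dev; the project IDs below are hypothetical placeholders, not the real ASWF entries:

```python
# Minimal sketch of aggregating badge progress for a set of projects via the
# BadgeApp JSON API. Fields badge_percentage_0/1/2 correspond to the
# Passing/Silver/Gold levels.
import json
import urllib.request

PROJECT_IDS = {"OpenEXR": 1234, "MaterialX": 5678}  # hypothetical IDs

def badge_progress(project_id: int) -> dict:
    url = f"https://www.bestpractices.dev/projects/{project_id}.json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return {level: data.get(f"badge_percentage_{i}")
            for i, level in enumerate(("passing", "silver", "gold"))}

for name, pid in PROJECT_IDS.items():
    print(name, badge_progress(pid))
```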
For discussion at the next TAC meeting, we have outlined seven requirements, based on projects' progress in completing the badge, that should be temporarily omitted until there are clear paths to address them.
Rationale: A preferred approach is to start with doing a threat model analysis, which could help projects improve their security posture before doing a security audit in the future. Work on this to be tracked at #615
Rationale: This would best come together with the threat model analysis discussed above. Work on this to be tracked at #615
Rationale: A plurality of projects have complex hardware needs that make this level of coverage hard to reach. Work to be done on how to address this.
Rationale: More analysis to be done. If the project is not distributing binaries, technically, this requirement is N/A, but it is desired to see how at least the common build patterns are reproducible.
GitHub Pages and ReadTheDocs seem to have some issues here; work to be done to fix this (see https://github.com/coreinfrastructure/best-practices-badge/issues/1878)
ACTION: TAC to review and approve.
The current analysis across all projects is at https://docs.google.com/spreadsheets/d/1bEacUNFizeT8QtfsvqiRNNgvty8_tweHjassHko6OhQ/edit?usp=sharing
For the 2 points about test coverage:
- the wording about "FLOSS tool" may be a bit too restrictive: a lot of interesting tools have a "free for open source" offering but aren't FLOSS themselves, for instance SonarCloud, which is used by a number of projects
- rather than setting arbitrary branch and instruction coverage values, maybe restate this as "a commitment to improving branch and instruction coverage over time"? (A rough sketch of what that could look like in CI follows this list.)
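To make "improving over time" concrete rather than aspirational, one option is a coverage ratchet in CI. This is only a sketch, assuming coverage.py has already written coverage.json (via `coverage json`); the baseline file name is an invented convention, not an existing tool:

```python
# Fail CI if coverage drops below the last recorded baseline; raise the
# baseline whenever coverage improves. Run `coverage run --branch -m pytest`
# and `coverage json` first so coverage.json exists.
import json
import pathlib
import sys

BASELINE = pathlib.Path("coverage-baseline.json")  # our own convention

totals = json.loads(pathlib.Path("coverage.json").read_text())["totals"]
pct = totals["percent_covered"]

baseline = json.loads(BASELINE.read_text())["percent"] if BASELINE.exists() else 0.0

if pct + 0.001 < baseline:  # small tolerance for floating-point noise
    sys.exit(f"Coverage regressed: {pct:.2f}% < baseline {baseline:.2f}%")

BASELINE.write_text(json.dumps({"percent": max(pct, baseline)}))
print(f"Coverage {pct:.2f}% (baseline now {max(pct, baseline):.2f}%)")
```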
For "reproducible builds": we should separate the steps of having a "reproducible build process" (which most projects provide through their CI) from the more stringent "reproducible build" requirement of being able to produce a bit for bit exact artifact which none of our projects are able to achieve (and may not be possible in many cases). It may be sufficient to clearly define the scope of what is meant by "reproducible builds" to allow most projects to meet that requirement by having a CI in place.
For the project website: adding a clause about "widely used web hosting infrastructures such as GitHub Pages or ReadTheDocs" could allow projects to tick off that slightly modified requirement?
Good points @jfpanisset - comments on two of your points...
> For the 2 points about test coverage:
> - the wording about "FLOSS tool" may be a bit too restrictive: a lot of interesting tools have a "free for open source" offering but aren't FLOSS themselves, for instance SonarCloud, which is used by a number of projects
I think the intention here is to not burden a project with having to purchase/depend on commercial tools, which, even if open source pricing is available, can still be prohibitive. If anything, the restriction is probably more a relief than a burden to the projects: if there aren't FLOSS tools available for the language, code in that language won't count toward any coverage requirements.
> For the project website: adding a clause about "widely used web hosting infrastructures such as GitHub Pages or ReadTheDocs" could allow projects to tick off that slightly modified requirement?
I think something of this nature is being added to the additional description section of that requirement at some point, based on the discussion in the issue thread I mentioned.
> the wording about "FLOSS tool" may be a bit too restrictive: a lot of interesting tools have a "free for open source" offering but aren't FLOSS themselves, for instance SonarCloud, which is used by a number of projects
The requirements are about FLOSS test suites, not FLOSS coverage tools. In other words, projects should not rely on, for example, a closed-source/proprietary test runner. SonarCloud doesn't generate the coverage data; it's just used to store and display the coverage results.
For "reproducible builds": we should separate the steps of having a "reproducible build process" (which most projects provide through their CI) from the more stringent "reproducible build" requirement of being able to produce a bit for bit exact artifact which none of our projects are able to achieve (and may not be possible in many cases). It may be sufficient to clearly define the scope of what is meant by "reproducible builds" to allow most projects to meet that requirement by having a CI in place.
A project is either reproducible or it's not; providing CI should not count as reproducible. As for bit-for-bit, I wouldn't say that our projects are not able to achieve it: I have not heard of a project that has even tried to see if its builds were reproducible. And don't forget that Python projects also need to be reproducible (because yes, that's a thing in Python too).
In general, I like @jmertic's statement:
> Rationale: More analysis to be done. If the project is not distributing binaries, technically, this requirement is N/A, but it is desired to see how at least the common build patterns are reproducible.
So before relaxing it, mark it as "needs investigation". I would change "If the project is not distributing binaries, technically, this requirement is N/A" to say that a zip file or a tar file (for example, packages on PyPI) does count as a binary, because it has to be reproducible too. If they are not, Linux distributions will complain when our Python projects get distributed by them.
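As a starting point for that investigation, here is a sketch of checking whether a Python sdist/wheel is reproducible. It assumes the `build` package is installed; whether a given build backend honors SOURCE_DATE_EPOCH (which pins the timestamps that most often make two otherwise identical builds differ) varies, which is exactly what such a check would reveal. The epoch value and output directories are arbitrary:

```python
# Build an sdist/wheel twice with a pinned SOURCE_DATE_EPOCH and compare
# SHA-256 digests of everything produced.
import hashlib
import os
import pathlib
import shutil
import subprocess

def build_once(outdir: str) -> dict:
    env = dict(os.environ, SOURCE_DATE_EPOCH="1700000000")  # arbitrary fixed time
    shutil.rmtree(outdir, ignore_errors=True)
    subprocess.run(["python", "-m", "build", "--outdir", outdir],
                   env=env, check=True)
    return {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in pathlib.Path(outdir).iterdir()}

assert build_once("dist-a") == build_once("dist-b"), "build is not reproducible"
print("sdist/wheel are bit-for-bit identical across builds")
```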
Based on TAC feedback, we should refine the specific requirements for the Adopted Stage so as not to require the OpenSSF Best Practices badge at the Gold level. Instead, we can pull in the most common requirements and then encourage projects to achieve the badge. Action is for a group to help review/refine.
Just a few additional items (besides the ones listed above) that in re-reading today, I'm still unsure exactly how to satisfy. I'm not saying they should be dropped, but at least we should give some explicit guidance about what to do, or ASWF policy about what exactly constitutes satisfying them (ideally with a working example in one project):
- The project (both project sites and project results) SHOULD follow accessibility best practices so that persons with disabilities can still participate in the project and use the project results where it is reasonable to do so.
- The project MUST list external dependencies in a computer-processable way.
- The project results MUST check all inputs from potentially untrusted sources to ensure they are valid (an allowlist), and reject invalid inputs, if there are any restrictions on the data at all.
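On that last point, a minimal illustration of what "an allowlist" means in practice: validate untrusted input against an explicit set of allowed values and reject everything else, rather than trying to enumerate known-bad inputs. The field ("color space") and the allowed values here are hypothetical:

```python
# Validate untrusted input against an explicit allowlist and reject anything
# not on it.
ALLOWED_COLOR_SPACES = {"lin_srgb", "acescg", "raw"}  # hypothetical allowlist

def parse_color_space(value: str) -> str:
    token = value.strip().lower()
    if token not in ALLOWED_COLOR_SPACES:
        # Reject anything not explicitly allowed, rather than trying to
        # enumerate known-bad inputs (a denylist).
        raise ValueError(f"unsupported color space: {value!r}")
    return token

print(parse_color_space("ACEScg"))  # -> "acescg"
```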
@lgritz - those all seem reasonable to exclude. Note that the first one is a SHOULD, so whether we include it or not probably isn't a big deal either way.
I think for this one...
> The project MUST list external dependencies in a computer-processable way.
Any Python project using `requirements.txt` or the like would be fine here. For C/C++ projects, I believe CMake somewhat has a way to address external dependencies, IIRC.
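To illustrate what "computer-processable" buys you: a `requirements.txt` can be parsed mechanically, which is the point of the requirement. A small sketch using the third-party `packaging` library, with made-up requirement lines (for C/C++, a manifest such as vcpkg.json or a Conan file would play the same role):

```python
# Parse dependency declarations mechanically, demonstrating that a plain
# requirements.txt is "computer-processable". Requires the `packaging`
# library (pip install packaging); the sample lines are illustrative.
from packaging.requirements import Requirement

sample = ["numpy>=1.24", "PyOpenColorIO==2.3.*"]  # made-up pins
for line in sample:
    req = Requirement(line)
    print(req.name, str(req.specifier))
```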
Approved during the TAC meeting on 2024-06-26 - will have it approved by the GB via email