Update `kubernetes-client` organization to allow Github Actions to create pull requests
Organization or Repo: kubernetes-client
Users affected: all users
Describe the issue:
I think in the move to GH enterprise, an organization permissions change occurred which is preventing our github actions in the kubernetes-client/java repository (and probably all other repositories) from sending PRs. In particular, this is blocking our ability to regenerate new code.
The box in the repository settings to enable this for a GitHub Action is greyed out, and I believe this is because the organizational settings are restricted as described here:
https://docs.github.com/en/organizations/managing-organization-settings/disabling-or-limiting-github-actions-for-your-organization
Can we update the kubernetes-client organization to allow github actions to send PRs?
Thanks!
/assign @kubernetes/owners
Friendly ping on this one since it is blocking our ability to generate the 1.31 client.
Also for kubernetes-sigs, please! These actions used to work and don't anymore:
- https://github.com/kubernetes-sigs/provider-aws-test-infra/actions
- https://github.com/kubernetes-sigs/hydrophone/actions
cc @Priyankasaggu11929 @cblecker
Friendly Friday ping on this. Thanks!
Hey @brendandburns @dims !
I apologize that this response was delayed, and that this change disrupted existing workflows without appropriate notice. You are correct that when we migrated all the orgs into the enterprise, the enterprise-wide policies started taking effect. For other orgs, this setting had already been disabled, but it seems it wasn't on the @kubernetes-client org.
The GitHub Admin Team, along with the SRC, has previously identified some issues with how GitHub Actions was being used in ways that created potential security concerns. As a result, one of the security hardening measures that was implemented was disabling the ability for GitHub Actions to perform certain write actions, including creating PRs and approving code to merge. It seems that this setting wasn't ubiquitously applied across all orgs, and when we brought everything into the enterprise and it started to be enforced, it broke a couple of workflows, including the one you describe above.
At this time, we are not looking at reverting the change to the setting as it weakens the project's security posture. We would be happy to collaborate on alternatives, such as prow jobs, to enable this type of functionality without the use of GitHub Actions.
@cblecker we currently use github actions to regenerate code from Kubernetes Swagger. What is the current approach that you suggest? I'm not sure this is something that prow really wants to take on (nor am I sure that is the right solution)
And imho restricting this setting doesn't add any additional security, because I can just add my own GITHUB_TOKEN to my github action's secrets and push PRs using my credentials (which is arguably way less secure) so I'm not sure that this accomplishes much.
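To illustrate the workaround being described (a sketch only; the secret name `MY_PAT` and branch name are hypothetical), a workflow step can simply substitute a personal access token for the automatic GITHUB_TOKEN:

```yaml
# Hypothetical workflow step: open a PR using a personal access token (PAT)
# stored as a repo secret, instead of the automatic GITHUB_TOKEN. The
# org-level "Actions may not create PRs" policy does not apply to a PAT,
# which is the point being made: the restriction pushes maintainers toward
# a longer-lived, less auditable credential.
- name: Open PR with a personal token
  env:
    GH_TOKEN: ${{ secrets.MY_PAT }}   # hypothetical secret name
  run: gh pr create --fill --head automated/regenerate
```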
I wanted to also clarify that these actions aren't run on PRs; they are manually triggered, and only by the repo owners, so I think that some of the security concerns don't apply.
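A minimal sketch of such a manually triggered workflow (the workflow name, script path, and branch are assumptions for illustration, not the actual kubernetes-client/java workflow):

```yaml
# Sketch only: a workflow that can ONLY be triggered manually, by someone
# with write access to the repository; it has no pull_request or push
# triggers, so it never executes untrusted PR content.
name: Regenerate client
on:
  workflow_dispatch: {}      # manual trigger, repo collaborators only

permissions:
  contents: write            # push the generated-code branch
  pull-requests: write       # open the PR (still subject to the org/enterprise policy)

jobs:
  regenerate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./scripts/regenerate.sh             # hypothetical regeneration script
      - uses: peter-evans/create-pull-request@v6 # widely used community action
        with:
          branch: automated/regenerate
          title: "Regenerate client code"
```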
@cblecker friendly ping on this as it is blocking/making more difficult the generation of Kubernetes clients.
fwiw, this is also blocking the ability to automatically send PRs back to the repository after we cut a release.
@cblecker we really need to revisit this policy as it is blocking necessary automation. There are a lot of tasks for which prow is not the right solution.
Actually, this is worse than that: it is blocking our ability to cut a release at all:
https://github.com/kubernetes-client/java/actions/runs/11262193200/job/31317326994
Moving to a manual process for releasing this client is a bad, less secure idea. Please advise on how you think we should do this.
@cblecker @Priyankasaggu11929 @jasonbraganza
Who is the right person to work with on this policy? It really restricts the utility of GitHub Actions, and it doesn't make sense to move to other CI/CD.
Hello @brendandburns, extremely sorry for the silence.
I don't have an answer just yet, but I'm acknowledging your above comments (on behalf of the GitHub Admins). I have re-started a discussion within the GitHub Admin group. I'll post updates here again next week.
Hello @brendandburns, will the following approach help unblock the failing workflows?
ref: https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/controlling-permissions-for-github_token#overview
You can use `permissions` to modify the default permissions granted to the GITHUB_TOKEN, adding or removing access as required, so that you only allow the minimum required access.
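For example (a sketch; the scopes shown are just the ones a PR-creating job would minimally need):

```yaml
# Top of a workflow file: grant the automatic GITHUB_TOKEN only the scopes
# this job needs. Note that the org/enterprise-level "Allow GitHub Actions
# to create and approve pull requests" setting must also permit PR creation.
permissions:
  contents: write
  pull-requests: write
```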
It was pointed out in another, similar report of GitHub Actions workflows breaking after the enterprise move, and it helped fix the permissions issue there. (ref: slack thread)
We've also reached out to GitHub staff with a new feature request to allow more granular management of GitHub Actions permissions at the repo/org level in a GH enterprise setup (as of now, it is not possible).
@Priyankasaggu11929 I will try that, but given that apparently the policy is blocking that permission:
https://docs.github.com/en/organizations/managing-organization-settings/disabling-or-limiting-github-actions-for-your-organization#preventing-github-actions-from-creating-or-approving-pull-requests
I would be surprised if it works (and honestly, if it does work, it's probably a security issue for GH)
> I would be surprised if it works (and honestly, if it does work, it's probably a security issue for GH)
Yes, I agree using `permissions:` is not the best way here.
Though, having checked the scope of permissions for the automatic GITHUB_TOKEN, it seems it is scoped to the repository containing the workflow, and the token expires when the job finishes or after a maximum of 24 hours.
Regardless, we're discussing (within the GH Admin group) auditing the use of `permissions:` in Kubernetes GH repos.
> We've also reached out to GitHub staff with a new feature request to allow more granular management of GitHub Actions permissions at the repo/org level in a GH enterprise setup (as of now, it is not possible).
Quick update: GitHub has accepted the above request, and hopefully we'll have the feature/workaround by Q1 of next year. cc: @mrbobbytables
/reopen
We have the same issue in kubernetes-sigs/controller-tools.
@brendandburns Did https://github.com/kubernetes-client/java/pull/3778 resolve your issue? (I think it didn't, but the job logs are not available anymore: https://github.com/kubernetes-client/java/actions/workflows/generate.yml)
We already have the permission in our action in controller-tools (https://github.com/kubernetes-sigs/controller-tools/blob/main/.github/workflows/tools-releases.yml#L12C8-L12C16) and it's not enough in our case (https://github.com/kubernetes-sigs/controller-tools/actions/runs/12463704971).
So overall it sounds like we're waiting for the GH enterprise feature that potentially resolves this.
@sbueringer: Reopened this issue.
In response to this:
> /reopen
>
> We have the same issue in kubernetes-sigs/controller-tools.
>
> @brendandburns Did https://github.com/kubernetes-client/java/pull/3778 resolve your issue? (I think it didn't, but the job logs are not available anymore: https://github.com/kubernetes-client/java/actions/workflows/generate.yml)
>
> We already have the permission in our action in controller-tools (https://github.com/kubernetes-sigs/controller-tools/blob/main/.github/workflows/tools-releases.yml#L12C8-L12C16) and it's not enough in our case (https://github.com/kubernetes-sigs/controller-tools/actions/runs/12463704971).
>
> So overall it sounds like we're waiting for the GH enterprise feature that potentially resolves this.
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
Adding us to this issue so that we can see when there's an update. LWKD is also affected by this.
Is anyone aware of any updates from the GitHub side?
> Is anyone aware of any updates from the GitHub side?
"Enterprise custom properties, enterprise rulesets" are GA now . Though, GH Admins are yet to test it on our Kubernetes enterprise account. I'm assuming this will be after KubeCon EU 2025 now, but sharing as FYI [1] [2].
[1] https://github.blog/changelog/2025-03-24-enterprise-custom-properties-enterprise-rulesets-and-pull-request-merge-method-rule-are-all-now-generally-available/ [2] https://docs.github.com/en/enterprise-cloud@latest/admin/enforcing-policies/enforcing-policies-for-your-enterprise/managing-policies-for-code-governance
Is there anything we can do from outside the GH admins team to help move this forward?
> Is anyone aware of any updates from the GitHub side?
>
> "Enterprise custom properties, enterprise rulesets" are GA now.
Update: even with the above, it is still not possible to provide granular permissions for GitHub Actions for selected repos. We have passed the feedback to GitHub.
Do we have any hope that GitHub will address this eventually? :)
(I'm asking, because if not, I'll look into options to reduce toil caused by this for maintainers on our side)
The Kubernetes project currently lacks enough contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:
- Mark this issue as fresh with `/remove-lifecycle stale`
- Close this issue with `/close`
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
/remove-lifecycle stale
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:
- Mark this issue as fresh with `/remove-lifecycle rotten`
- Close this issue with `/close`
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle rotten
/remove-lifecycle rotten