Storage Object Get Access Error
TL;DR
When using the cloudbuild workflow, the action job reports as failing due to a storage access error. However, the job is correctly triggered in Cloud Build and completes successfully.
My problem seems similar to the one described in https://github.com/GoogleCloudPlatform/github-actions/issues/49, but as I needed some clarification I opened this issue.
Expected behavior
The workflow completes successfully.
Observed behavior
The error message:
```
Build and push image to Google Container Registry (4s)
Run gcloud builds submit \
gcloud builds submit \
  --quiet \
  --tag "gcr.io/$PROJECT_ID/$REPOSITORY_NAME:$GITHUB_SHA"
shell: /bin/bash -e {0}
env:
  PROJECT_ID: ***
  CLOUDSDK_CORE_PROJECT: ***
  REPOSITORY_NAME: ***
  CLOUDSDK_METRICS_ENVIRONMENT: github-actions-setup-gcloud
Creating temporary tarball archive of 148 file(s) totalling 8.9 MiB before compression.
Some files were not included in the source upload.
Check the gcloud log [/home/runner/.config/gcloud/logs/2020.05.15/21.07.45.085431.log] to see which files and the contents of the
default gcloudignore file used (see `$ gcloud topic gcloudignore` to learn
more).
Uploading tarball of [.] to [gs://***_cloudbuild/source/1589576865.25-e65b89df2a91419fbff076630958d5ee.tgz]
Created [https://cloudbuild.googleapis.com/v1/projects/***/builds/59a1f2ff-beee-4f1a-8147-504efe4014fd].
Logs are available at [https://console.cloud.google.com/cloud-build/builds/59a1f2ff-beee-4f1a-8147-504efe4014fd?project=192068846044].
ERROR: (gcloud.builds.submit) HTTPError 403: <?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>*****@*****.iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object.</Details></Error>
##[error]Process completed with exit code 1.
```
Following the logs link, I can see that everything ran fine in spite of the error.
Reproduction
Action YAML
```yaml
name: ci
on:
  pull_request:
    types:
      - opened
      - synchronize
      - reopened
  push:
    branches:
      - master
    tags:
      - "[0-9]+.[0-9]+.[0-9]+"
env:
  PROJECT_ID: ${{ secrets.PROJECT_ID }}
  CLOUDSDK_CORE_PROJECT: ${{ secrets.PROJECT_ID }}
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Retrieve the repository name
        run: echo ::set-env name=REPOSITORY_NAME::$(echo "$GITHUB_REPOSITORY" | awk -F / '{print $2}')
        shell: bash
      - name: setup gcloud CLI
        uses: GoogleCloudPlatform/github-actions/setup-gcloud@master
        with:
          service_account_key: ${{ secrets.GCP_SA_KEY }}
          project_id: ${{ secrets.PROJECT_ID }}
      - name: Build and push image to Google Container Registry
        run: |-
          gcloud builds submit \
            --quiet \
            --tag "gcr.io/$PROJECT_ID/$REPOSITORY_NAME:$GITHUB_SHA"
```
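The "Retrieve the repository name" step can be checked locally; this is a minimal sketch, using a hypothetical repository path in place of the runner-provided `GITHUB_REPOSITORY`:

```shell
# Hypothetical repository path for illustration; in the workflow,
# GITHUB_REPOSITORY is set by the GitHub Actions runner.
GITHUB_REPOSITORY="octo-org/octo-repo"

# Same extraction as the workflow step: split on "/" and keep the second field.
REPOSITORY_NAME=$(echo "$GITHUB_REPOSITORY" | awk -F / '{print $2}')
echo "$REPOSITORY_NAME"   # prints "octo-repo"
```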
- I have a dedicated service account for my project
- I use a JSON key to authenticate
- I assigned the following roles to this service account:
- roles/cloudbuild.builds.builder
- roles/cloudbuild.serviceAgent
- roles/compute.serviceAgent
- roles/container.clusterAdmin
- roles/container.serviceAgent
- roles/storage.admin
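For reference, this is how such grants can be made with the gcloud CLI; a sketch only, where `PROJECT_ID` and the service account email are placeholders, not the actual values from my setup:

```shell
# Placeholder project id and service account email.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member "serviceAccount:my-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role "roles/cloudbuild.builds.builder"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member "serviceAccount:my-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role "roles/storage.admin"
```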
Additional information
As a workaround, I added a JSON key to the service account which was automatically created by GCP ([email protected]), used it to authenticate this action, and it worked like a charm.
EDIT (June 21st, 2020):
- My service account did not have the roles/viewer role.
This page might help explain the error or at least point towards workarounds.
The error stopped happening when I tried either of that page's options for viewing logs:
- Grant the Viewer role to my custom service account (I needed to wait a few minutes for the grant to take effect), or
- Specify logsBucket in cloudbuild.yaml and pass in a custom bucket
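For the second option, a minimal cloudbuild.yaml sketch; the build step and bucket name here are placeholders, not my actual config:

```yaml
# Hypothetical build config; my-custom-logs-bucket is a placeholder.
steps:
  - name: "gcr.io/cloud-builders/docker"
    args: ["build", "-t", "gcr.io/$PROJECT_ID/my-image:$COMMIT_SHA", "."]
logsBucket: "gs://my-custom-logs-bucket"
```

With logsBucket set, the service account only needs read access to that one bucket rather than project-wide log viewing permissions.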
I'm still surprised I ran into this issue. It feels like the Cloud Build documentation is missing some required grants.
I ran into this today as well, and as a beginner with Google Cloud this was a huge setback. I sat here troubleshooting redeploy after redeploy until this morning, when I found them in my Container Registry and realized my builds were not failing after all.
> I'm still surprised I ran into this issue. It feels like the Cloud Build documentation is missing some required grants.
The documentation for GCP and any of its services is 50/50 at best. I've been scouring GCP documentation for the last month for various things, and most of their docs either reference one partial service of the entire workflow very vaguely or are missing the relevant, helpful documentation needed to actually get something working. Service Accounts and IAM are the worst offenders. I've been leaving a trail of feedback.
Thanks to @agray-22's comment, I can confirm that explicitly adding the roles/viewer role to my custom service account solved the issue.
It is not clear to me why roles/storage.admin is not enough, but at least I can now use custom service accounts for my pipelines.
I can also confirm adding the Viewer role to my service account fixed the issue:
You should not grant the Viewer role, which is far too broad (project-wide). As @agray-22 explained, setting logsBucket in cloudbuild.yaml is a better solution (then you only have to set the right permissions on that bucket).
I had a similar problem to the OP's, but in my case it was because I gave my service account the Storage Object Admin role instead of the Storage Admin role. Please double-check that; the names of the roles are very similar.
Can also confirm this fixed our issue
Thanks
Hello I also confirm that the role viewer solved this issue for us ! Thanks a lot !
Can also confirm that the Viewer role solved this issue for us, but it's a bit strange as we already have the Storage Admin role.
I can get this to work without the Viewer role by granting storage.objects.get at the project level, on top of adding the Storage Admin role at the bucket level. While this is still at the project level, at least it's a lot less broad than the Viewer role, as some have pointed out.
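A project-level grant of just storage.objects.get can be made through a custom role; this is a sketch only, with placeholder role id, project id, and service account:

```shell
# Create a custom role containing only the storage.objects.get permission
# (objectGetter, PROJECT_ID, and the service account are placeholders).
gcloud iam roles create objectGetter \
  --project PROJECT_ID \
  --permissions storage.objects.get

# Bind it to the service account at the project level.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member "serviceAccount:my-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role "projects/PROJECT_ID/roles/objectGetter"
```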
That said, it looks like this is a lack of documentation on GCP's side rather than an issue with this action.
Adding the viewer role on the service account did not fix this issue for me.
Giving the Cloud Build service account a Storage Object Viewer role worked for me.
The Viewer role gives more permissions than required.
Giving a service account view privileges over the whole project cannot be taken as a solution for a production environment; that defeats the whole purpose of having service accounts.
If anyone's still having this issue (of the service account not having access to the Google Cloud Storage object) and doesn't want to grant the Project Viewer role in production environments, I was able to resolve it with the following steps:
- On Cloud Storage (object storage), create a new bucket with its Access Control set to Fine-grained (object-level ACLs).
- Get the bucket id (located in the Configuration tab under gsutil URI); it'll look like gs://BUCKET_ID.
- Give your service account the Storage Object Viewer role for that new bucket.
- In your GitHub Actions YAML file, add the --gcs-log-dir "gs://$BUCKET_ID" flag to the gcloud builds script. This specifies the log bucket for the Cloud Build action as referenced here, but using the CLI it's --gcs-log-dir as referenced here.

Add a new secret for BUCKET_ID containing the bucket id of the Cloud Storage bucket to your GitHub secrets, and then your YAML should look something like this:
```yaml
...
- name: Build and Push
  run: |-
    gcloud builds submit \
      --quiet \
      --gcs-log-dir "gs://$BUCKET_ID" \
      --tag "gcr.io/$PROJECT_ID/$REPOSITORY_NAME:$GITHUB_SHA"
...
```
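The bucket-level Storage Object Viewer grant from the steps above can also be made from the CLI; a sketch only, with a placeholder service account and bucket name:

```shell
# Grant Storage Object Viewer on just the log bucket
# (the service account email and bucket name are placeholders).
gsutil iam ch \
  serviceAccount:my-sa@PROJECT_ID.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://my-logs-bucket
```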
Hi folks - this issue does not appear to be related to the setup and installation of the gcloud CLI. For general Google Cloud support and feedback (including Cloud Build), please open an issue here.