
The runner has received a shutdown signal

Open grische opened this issue 2 years ago • 10 comments

While running the action-gh-release action, we regularly see the operation being cancelled. One run failed after 17 minutes, so I assume it is not a timeout.

It seems to happen after the last upload has concluded.

Here is an example running with Debug enabled: https://github.com/freifunkMUC/site-ffm/actions/runs/4940203059/jobs/8842882643

Here are the last lines before it failed:

⬆️ Uploading x86-legacy_output.tar.gz...
##[debug]Re-evaluate condition on job cancellation for step: 'Create Release & Upload Release Assets'.
##[debug]Skip Re-evaluate condition on runner shutdown.
Error: The operation was canceled.
##[debug]System.OperationCanceledException: The operation was canceled.
##[debug]   at System.Threading.CancellationToken.ThrowOperationCanceledException()

Other (non-expired) examples are:

  • https://github.com/freifunkMUC/site-ffm/actions/runs/4940203059/jobs/8836080422
  • https://github.com/freifunkMUC/site-ffm/actions/runs/4335649576/jobs/7571678793
  • https://github.com/freifunkMUC/site-ffm/actions/runs/4335649576/jobs/7571530699

grische avatar May 11 '23 09:05 grische

@softprops is there anything else we can provide to help debug this issue?

grische avatar Jun 02 '23 14:06 grische

Here is a workflow that reproduces it:

jobs:
  job_init:
    runs-on: ubuntu-latest
    steps:
    - name: Generate Image
      id: generate_image
      run: |
        # Create six ~1.9 GB files of zeros (~11 GB total) to trigger the failure
        mkdir output
        cd $GITHUB_WORKSPACE/output
        dd if=/dev/zero of=outputfile.img.00 bs=1M count=1900
        dd if=/dev/zero of=outputfile.img.01 bs=1M count=1900
        dd if=/dev/zero of=outputfile.img.02 bs=1M count=1900
        dd if=/dev/zero of=outputfile.img.03 bs=1M count=1900
        dd if=/dev/zero of=outputfile.img.04 bs=1M count=1900
        dd if=/dev/zero of=outputfile.img.05 bs=1M count=1900
        echo "FIRMWARE=$PWD" >> $GITHUB_ENV
        echo "status=success" >> $GITHUB_OUTPUT
    - name: Upload Toolchain to release
      uses: softprops/[email protected]
      if: steps.generate_image.outputs.status == 'success'
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      with:
        tag_name: 'test'
        files: ${{ env.FIRMWARE }}/*

and the run fails with:

The runner has received a shutdown signal. This can happen when the runner service is stopped, or a manually started runner is canceled.


smallprogram avatar Aug 03 '23 13:08 smallprogram

Can anyone help fix this issue? It is causing me a lot of trouble.

smallprogram avatar Sep 15 '23 17:09 smallprogram

@softprops is it possible to limit the number of files uploaded? If we send them one by one it might work properly.
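
In the meantime, one possible workaround is to invoke the action more than once with narrower file globs, so each step only uploads part of the output. A minimal sketch based on the reproduction workflow above; the step names and bracket globs are made up, and it assumes the action attaches assets to the existing release when called again with the same tag_name:

    - name: Upload first batch of images to release
      uses: softprops/action-gh-release@v1
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      with:
        tag_name: 'test'
        files: ${{ env.FIRMWARE }}/outputfile.img.0[0-2]
    - name: Upload second batch of images to release
      uses: softprops/action-gh-release@v1
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      with:
        tag_name: 'test'
        files: ${{ env.FIRMWARE }}/outputfile.img.0[3-5]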

grische avatar Sep 15 '23 17:09 grische

@grische I tried it and found that this problem occurs when the free space left on /dev/root is smaller than the total size of the files to be uploaded. When the free space is larger than the total upload size, the run succeeds. I don't know whether this counts as a real fix, but for now it works for me.
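
For what it's worth, a quick way to check that on a failing run is to print the free space and the upload size just before the release step. A minimal sketch; this step is an addition for diagnosis, not part of the original workflow:

    - name: Show free disk space before upload
      run: |
        # Compare the 'Avail' column for the root filesystem with the
        # total size of the files passed to the release step.
        df -h /
        du -sh $GITHUB_WORKSPACE/output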

smallprogram avatar Sep 19 '23 03:09 smallprogram

I encountered the same issue a few hours ago, while trying to release a few large files via this GitHub Action. @softprops , any ideas for a workaround?

seppzer0 avatar Sep 24 '23 11:09 seppzer0

I confirmed that this issue is specific to this particular GitHub Action.

I used a similar release action to publish the same set of files, 4.9 GB in total, and it worked just fine.

In comparison, the same pipeline using this action fails as described above.

seppzer0 avatar Sep 24 '23 16:09 seppzer0

I ran into the same issue. I believe the runner ran out of memory, because this tool loads all the files into memory before uploading them:

https://github.com/softprops/action-gh-release/blob/c9b46fe7aad9f02afd89b12450b780f52dacfb2d/src/github.ts#L131

In my case it was 4 files of around 1.5 GB each, about 6 GB total, so no wonder it was too much.

It is of course a serious flaw that this happens. Uploads, especially of potentially large files, should be written so that files are never loaded into memory all at once, only one chunk at a time.
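
For illustration only, here is a minimal sketch of what a chunked upload could look like, using Node's built-in https module and fs.createReadStream. The helper name, parameters, and URL handling are made up; this is not the action's actual code, just the general streaming approach:

import { createReadStream, statSync } from "fs";
import { request } from "https";

// Hypothetical helper: streams a single file to a GitHub release asset
// upload URL instead of buffering the whole file in memory first.
function uploadAssetStreaming(
  uploadUrl: string, // e.g. https://uploads.github.com/repos/<owner>/<repo>/releases/<id>/assets
  filePath: string,
  token: string,
  name: string
): Promise<number> {
  const size = statSync(filePath).size;
  const url = new URL(uploadUrl);
  url.searchParams.set("name", name);
  return new Promise((resolve, reject) => {
    const req = request(
      url,
      {
        method: "POST",
        headers: {
          authorization: `token ${token}`,
          "content-type": "application/octet-stream",
          "content-length": size, // required when the body is streamed
          "user-agent": "streaming-upload-sketch",
        },
      },
      res => {
        res.resume(); // drain the response body
        resolve(res.statusCode ?? 0);
      }
    );
    req.on("error", reject);
    // Pipe the file in chunks; memory usage stays bounded regardless of file size.
    createReadStream(filePath).pipe(req);
  });
}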

enumag avatar Nov 05 '23 21:11 enumag

Did someone find an alternative tool that doesn't have this issue? There are plenty on the GitHub Actions Marketplace, but so far I couldn't find one that is similarly feature-complete and doesn't have this bug.

enumag avatar Nov 05 '23 22:11 enumag

@softprops It shouldn't be too difficult to fix this bug. Most likely you just need to use this: https://www.geeksforgeeks.org/node-js-fs-createreadstream-method/

enumag avatar Nov 05 '23 22:11 enumag