aws-cli
Broken pipe error when piping "s3 ls" output to grep -q
Confirm by changing [ ] to [x] below to ensure that it's a bug:
- [x] I've gone through the User Guide and the API reference
- [x] I've searched for previous similar issues and didn't find any solution
Describe the bug
`[Errno 32] Broken pipe` is raised when `aws s3 ls` output is piped to `grep -q` and the matching string is found; the exit code is 255.
SDK version number aws-cli/1.18.220 Python/2.7.17 botocore/1.19.60
Platform/OS/Hardware/Device Linux/4.15.0-134-generic x86_64, Ubuntu 18.04.5 LTS
To Reproduce (observed behavior) Steps to reproduce the behavior
```
set -o pipefail; aws s3 ls s3://SOME_S3_PATH | grep -q SOME_STR; echo $?
[Errno 32] Broken pipe
255
```
Expected behavior No failure, just a clean exit with code 0.
Additional context
Command `grep -q` stops immediately after the first match, and the program writing to the pipe receives SIGPIPE. It is clear that, in the case of `s3 ls`, this signal can be ignored.
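To illustrate that point: restoring the default SIGPIPE disposition makes a Python producer die silently when its reader exits early, just as classic Unix tools do. This is only a stand-alone sketch (the listing loop is a made-up stand-in for `aws s3 ls`, not aws-cli code), and the Python docs warn that catching BrokenPipeError is usually preferable to SIG_DFL:

```python
import signal
import subprocess
import sys

# A tiny producer that restores SIGPIPE's default disposition and then
# prints many lines, standing in for "aws s3 ls" on a large bucket.
PRODUCER = (
    "import signal\n"
    "signal.signal(signal.SIGPIPE, signal.SIG_DFL)\n"
    "for i in range(200000):\n"
    "    print('object-%d' % i)\n"
)

producer = subprocess.Popen([sys.executable, "-c", PRODUCER],
                            stdout=subprocess.PIPE)
# "head -n 1" reads a single line and exits, like "grep -q" after a match.
consumer = subprocess.Popen(["head", "-n", "1"],
                            stdin=producer.stdout,
                            stdout=subprocess.DEVNULL)
producer.stdout.close()  # so SIGPIPE reaches the producer, not us
consumer.wait()
rc = producer.wait()
# The producer is killed by SIGPIPE: no traceback is printed, and
# subprocess reports the terminating signal as a negative return code.
```

With SIG_DFL in effect the producer simply disappears, which is exactly the behaviour people expect from a pipeline on POSIX systems.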
Debug shows this:
```
2021-01-28 20:03:03,419 - MainThread - awscli.clidriver - DEBUG - Exception caught in main()
Traceback (most recent call last):
  File "awscli/clidriver.py", line 457, in main
  File "awscli/customizations/commands.py", line 198, in __call__
  File "awscli/customizations/commands.py", line 191, in __call__
  File "awscli/customizations/s3/subcommands.py", line 509, in _run_main
  File "awscli/customizations/s3/subcommands.py", line 588, in _list_all_objects_recursive
  File "awscli/customizations/s3/subcommands.py", line 566, in _display_page
  File "awscli/customizations/utils.py", line 181, in uni_print
BrokenPipeError: [Errno 32] Broken pipe
[Errno 32] Broken pipe
Exception ignored in: <_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>
BrokenPipeError: [Errno 32] Broken pipe
```
I get an exit code of 120, not 255.
It looks like we would need to do this to resolve it:
https://docs.python.org/3/library/signal.html#note-on-sigpipe
Fixed this in PR https://github.com/aws/aws-cli/pull/6303, please review :)
Actively cc'ing @kdaily, as this thread is a bit slow-paced and somewhat quiet
Hi @bcap,
Thanks for the PR, marking this issue to be reviewed.
`aws s3 ls s3://XXXX | head -n 1` does the same. Currently I'm buffering it via a tmp file:
`aws s3 ls s3://XXXX > /tmp/aws-log.txt && cat /tmp/aws-log.txt | head -n 1`
but I'm sure there is a more elegant way.
I'm seeing the same behaviour piping to `head` as @FergusFettes. Also seeing it when piping to `grep` with `-m` to limit results, e.g.:
`aws s3 ls s3://my-bucket/ | grep -m 10 -e 'regex'`
I assume the pipe is broken because `head` completes before `aws s3 ls` does, and it's particularly noticeable if the number of items being listed is much greater than the number of items being filtered with `head`.
I don't know enough about Linux programming in Python to know how to fix it, but I think buffering it through a temp file is probably the simplest fix!
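For what it's worth, the failure mode described in this thread can be reproduced in a few lines of plain Python, without aws-cli at all: Python ignores SIGPIPE at interpreter startup, so when the reader (`head`, or `grep -q` after a match) goes away, the write fails with BrokenPipeError instead of killing the process silently. (The names below are made up for illustration.)

```python
import os

def write_lines(stream, n):
    """Write n lines, reporting whether the reader closed the pipe early."""
    try:
        for i in range(n):
            stream.write("object-%d\n" % i)
        stream.flush()
        return 0
    except BrokenPipeError:
        return 1  # the reader (e.g. "grep -q") is gone

# Simulate "grep -q" finding its match instantly: close the read end
# of a pipe before anything is written to it.
read_fd, write_fd = os.pipe()
os.close(read_fd)
writer = os.fdopen(write_fd, "w")
status = write_lines(writer, 200000)  # -> 1: BrokenPipeError was raised
try:
    writer.close()
except BrokenPipeError:
    pass  # closing flushes the buffer, which can fail the same way
```

So the fix has to live on the writing side: either tolerate BrokenPipeError or restore the default SIGPIPE disposition; buffering through a temp file just sidesteps the pipe entirely.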
> I get an exit code of 120, not 255.

Got a similar issue, also with exit code 120; I haven't been able to find an explanation. My platform differs from the OP's, but the result seems to be the same.
This works:
`aws s3 ls ... | grep ...`
If I add `-q` to grep and there's a regex match, it crashes:
```
diego@ip-192-168-0-103 ~/Desktop % aws s3 ls s3://bucket/prefix/ | grep -q 'stuff'
[Errno 32] Broken pipe
Exception ignored in: <_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>
BrokenPipeError: [Errno 32] Broken pipe
```
If there's no match, the command exits normally with code 1 as expected.
My shell is zsh 5.9 and `aws --version` gives:
`aws-cli/2.15.21 Python/3.11.7 Darwin/23.2.0 source/arm64 prompt/off`