logstash-output-s3
Error when deleting temporary gzip files
The s3 output uploader throws an error after uploading when it tries to delete the temporary file. This only happens if encoding is set to gzip.
Here is an example error message (full stack below):
15:15:51.050 [S3 output uploader, file: s3_temp/457c907d-f8a0-4885-9e3e-0bc0fb12dcab/json_test_s3/ls.s3.007ad655-d7a9-4480-909b-e9226208aeb6.2017-04-13T15.15.part0.txt.gz] ERROR logstash.outputs.s3 - An error occured in the on_complete uploader {:exception=>Errno::EACCES, :message=>"Permission denied - s3_temp/457c907d-f8a0-4885-9e3e-0bc0fb12dcab/json_test_s3/ls.s3.007ad655-d7a9-4480-909b-e9226208aeb6.2017-04-13T15.15.part0.txt.gz"
Other notes:
- The upload succeeds (I see the files in the S3 bucket)
- The delete works fine if encoding is set to the default (none).
- It fails regardless of whether temporary_directory is set to the default or to a local directory (in the example above, it's set to a local 's3_temp' directory)
- From the file explorer, I can't open or delete the file until I kill the Logstash process. If I copy and unzip it, it contains valid data.
- OS: Windows 10
If I can provide more info, please let me know.
Full Error:
15:15:51.050 [S3 output uploader, file: s3_temp/457c907d-f8a0-4885-9e3e-0bc0fb12dcab/json_test_s3/ls.s3.007ad655-d7a9-4480-909b-e9226208aeb6.2017-04-13T15.15.part0.txt.gz] ERROR logstash.outputs.s3 - An error occured in the on_complete uploader
  :exception => Errno::EACCES
  :message   => "Permission denied - s3_temp/457c907d-f8a0-4885-9e3e-0bc0fb12dcab/json_test_s3/ls.s3.007ad655-d7a9-4480-909b-e9226208aeb6.2017-04-13T15.15.part0.txt.gz"
  :path      => "s3_temp/457c907d-f8a0-4885-9e3e-0bc0fb12dcab/json_test_s3/ls.s3.007ad655-d7a9-4480-909b-e9226208aeb6.2017-04-13T15.15.part0.txt.gz"
  :backtrace =>
    org/jruby/RubyFile.java:1129:in `unlink'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1416:in `remove_file'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1421:in `platform_support'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1415:in `remove_file'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1404:in `remove'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:780:in `remove_entry'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1454:in `postorder_traverse'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1454:in `postorder_traverse'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1458:in `postorder_traverse'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1453:in `postorder_traverse'
    org/jruby/RubyArray.java:1613:in `each'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1452:in `postorder_traverse'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1453:in `postorder_traverse'
    org/jruby/RubyArray.java:1613:in `each'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:1452:in `postorder_traverse'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:778:in `remove_entry'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:713:in `remove_entry_secure'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:633:in `rm_r'
    org/jruby/RubyArray.java:1613:in `each'
    C:/elk/logstash-5.3.0/vendor/jruby/lib/ruby/1.9/fileutils.rb:631:in `rm_r'
    C:/elk/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.6/lib/logstash/outputs/s3/temporary_file.rb:54:in `delete!'
    C:/elk/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.6/lib/logstash/outputs/s3.rb:361:in `clean_temporary_file'
    org/jruby/RubyMethod.java:120:in `call'
    C:/elk/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.6/lib/logstash/outputs/s3/uploader.rb:49:in `upload'
    C:/elk/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.6/lib/logstash/outputs/s3/uploader.rb:29:in `upload_async'
    org/jruby/RubyProc.java:281:in `call'
    C:/elk/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/concurrent-ruby-1.0.0-java/lib/concurrent/executor/java_executor_service.rb:94:in `run'
    Concurrent$$JavaExecutorService$$Job_1733552327.gen:13:in `run'
Can the Logstash user delete files in the s3_temp directory? Do you see the file s3_temp/457c907d-f8a0-4885-9e3e-0bc0fb12dcab/json_test_s3/ls.s3.007ad655-d7a9-4480-909b-e9226208aeb6.2017-04-13T15.15.part0.txt.gz in the s3_temp directory?
Yes, the file is there. If I try to delete it, Windows says: "The action can't be completed because the file is open in Java(TM) Platform SE binary".
If I kill the Logstash process, then I can delete the file.
If I use encoding => "none", then logstash does not fail when deleting the temporary file.
So it looks like Logstash doesn't release the file handle correctly and keeps a reference to it. Windows is a bit stricter about files that are still in use, and that might result in the permission denied error you are experiencing. We will take a look as soon as possible.
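In the meantime, here is a minimal standalone sketch of the kind of pattern that would explain it. This is an illustration only, not the plugin's actual code, and the file name is made up:

require "zlib"

# Illustration only: an unclosed gzip wrapper keeps the underlying file
# handle open, so Windows refuses to delete the file until it is closed.
path = "ls.s3.example.part0.txt.gz"

file = File.open(path, "wb")
gzip = Zlib::GzipWriter.new(file)
gzip.write("some event data\n")
gzip.flush                 # bytes reach the disk, but the OS handle stays open

# On Windows this would raise Errno::EACCES while the handle is still open:
# File.unlink(path)

gzip.close                 # closes the GzipWriter *and* the wrapped File
File.unlink(path)          # now the delete succeeds

If the gzip path flushes the wrapped IO but only closes it much later (or not at all before the delete), that would match the symptom that the file can be copied and uploaded but not deleted until the process exits.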
Yes, in the gzip version it looks like the file handle is wrapped in another class (IOWrappedGzip). Unfortunately, I am not familiar enough with Ruby to resolve this myself.
Thank you for taking a look. I am happy to test any potential fix.
Hi @ph, when do you think you will have time to take a deeper look at this issue? Thanks for your support.
By the way, one of the side effects is that when you run Logstash again, it picks up the undeleted file in the s3_temp folder and uploads it to S3 again (along with whatever new input you have).
I'm getting the same issue. Does anyone have an update on this?
No, no luck.
I have the same problem and so far no solution; it occurs when the file is gzipped.
Same problem here on Linux 😦
[2019-11-23T18:38:29,760][ERROR][logstash.outputs.s3 ] An error occured in the on_complete uploader {:exception=>ArgumentError, :message=>"parent directory is world writable, FileUtils#remove_entry_secure does not work; abort: "/tmp/logstash/96bf1e71-0e84-49e2-a099-6902abfb541c" (parent directory mode 40777)", :path=>"/tmp/logstash/96bf1e71-0e84-49e2-a099-6902abfb541c/ls.s3.ec36987f-e2f3-46ce-891c-d5cc5f456181.2019-11-16T01.36.part36.txt.gz"
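Note that this Linux error is a different failure from the Windows one: it comes from Ruby's FileUtils.remove_entry_secure, which aborts when the parent of the directory it is removing is world writable without the sticky bit (that is the "mode 40777" in the message). A standalone sketch that reproduces the same ArgumentError, with made-up paths:

require "fileutils"
require "tmpdir"

# Sketch reproducing the ArgumentError above; the paths are made up.
# remove_entry_secure aborts when the parent of the directory being removed
# is world writable and does not have the sticky bit set.
base   = Dir.mktmpdir                         # stand-in for /tmp/logstash
subdir = File.join(base, "96bf1e71-example")  # stand-in for the per-upload directory
FileUtils.mkdir_p(subdir)
File.write(File.join(subdir, "part0.txt.gz"), "data")

File.chmod(0o777, base)                       # world writable, no sticky bit -> "mode 40777"
begin
  FileUtils.remove_entry_secure(subdir)
rescue ArgumentError => e
  puts e.message  # "parent directory is world writable, FileUtils#remove_entry_secure does not work; abort: ..."
end

File.chmod(0o700, base)                       # a non-world-writable parent passes the check
FileUtils.remove_entry_secure(subdir)         # succeeds
FileUtils.remove_entry(base)                  # clean up the scratch directory

Tightening the permissions on the configured temporary_directory (or pointing it at a directory that is not world writable) should avoid that particular abort.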