logstash-output-s3

S3 output plugin always reports AWS::S3::Errors::RequestTimeout

Open • ashalitkin opened this issue 9 years ago • 1 comment

I'm using the latest Logstash (2.3.1), https://download.elastic.co/logstash/logstash/logstash-2.3.1.zip, on Windows (tried on both 7 and 2008 R2).

Here is the configuration:

input {
    file {
        path => "some_log_file.logs"
        start_position => "beginning"
        ignore_older => 0
    }
}

output {
    s3 {
        access_key_id => "some_id"
        secret_access_key => "some_key"
        region => "us-east-1"
        bucket => "some_bucket"
        prefix => "/Logs/"
        time_file => 1
        size_file => 2048
        canned_acl => "public_read_write"
        temporary_directory => "some_folder"
    }
    stdout {}
}

The plugin registers correctly and the test file is uploaded successfully, but the next upload attempt fails. Here is what I have in the logs. For the test file:

{:timestamp=>"2016-04-11T16:34:43.997000+0000", :message=>"S3: Creating a test file on S3", :level=>:debug, :file=>"/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-output-s3-2.0.7/lib/logstash/outputs/s3.rb", :line=>"242", :method=>"test_s3_write"}
{:timestamp=>"2016-04-11T16:34:44.000000+0000", :message=>"S3: ready to write file in bucket", :remote_filename=>"/Logs/logstash-programmatic-access-test-object-1460392483", :bucket=>"some_bucket", :level=>:debug, :file=>"/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-output-s3-2.0.7/lib/logstash/outputs/s3.rb", :line=>"170", :method=>"write_on_bucket"}
{:timestamp=>"2016-04-11T16:34:45.966000+0000", :message=>"S3: has written remote file in bucket with canned ACL", :remote_filename=>"/Logs/logstash-programmatic-access-test-object-1460392483", :bucket=>"some_bucket", :canned_acl=>"public_read_write", :level=>:debug, :file=>"/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-output-s3-2.0.7/lib/logstash/outputs/s3.rb", :line=>"183", :method=>"write_on_bucket"}

For the next attempt:

{:timestamp=>"2016-04-11T16:35:11.843000+0000", :message=>"S3: ready to write file in bucket", :remote_filename=>"/Logs/ls.s3.WIN-0QTMO6ENPPD.2016-04-11T16.34.part0.txt", :bucket=>"some_bucket", :level=>:debug, :file=>"/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-output-s3-2.0.7/lib/logstash/outputs/s3.rb", :line=>"170", :method=>"write_on_bucket"}
...
{:timestamp=>"2016-04-11T16:37:56.526000+0000", :message=>"S3: AWS error", :error=>#<AWS::S3::Errors::RequestTimeout: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.>, :level=>:error, :file=>"/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-output-s3-2.0.7/lib/logstash/outputs/s3.rb", :line=>"178", :method=>"write_on_bucket"}

ashalitkin avatar Apr 11 '16 17:04 ashalitkin

Hmmm, it seems like we aren't checking for stale connections correctly.
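If stale pooled connections are the culprit, one way to test that theory outside the plugin would be to lower the SDK's idle timeout so it discards sockets that S3 has already closed instead of reusing them. This is only a sketch; the :http_idle_timeout option and the value below are assumptions about the v1 aws-sdk configuration, not something the s3 output exposes:

require 'aws-sdk'   # v1

# Assumption: the v1 SDK accepts :http_idle_timeout, controlling how long a
# pooled HTTP connection may sit idle before being dropped rather than reused.
# S3 closes idle connections on its side, so a reused stale socket would
# surface as the RequestTimeout seen above.
AWS.config(:http_idle_timeout => 5)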

andrewvc avatar May 17 '16 16:05 andrewvc