logstash-output-opensearch
[BUG] Logstash stops sending logs to AWS OpenSearch after 413 error
Describe the bug
Logstash stops sending logs to AWS OpenSearch after the following error:
[2023-03-01T17:37:33,001][ERROR][logstash.outputs.opensearch][main][57b0787b301fd8e07dfd642bdcd2bbf1cba37b72e6e7b2203623917ba57312ac] Encountered a retryable error (will retry with exponential backoff) {:code=>413, :url=>"https://............:443/_bulk", :content_length=>21732354}
I believe the fix for this was applied in this PR: https://github.com/opensearch-project/logstash-output-opensearch/pull/71
However, the content length shown in the error is higher than the default value of target_bulk_bytes introduced in version 1.3.0 of this plugin.
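For reference, target_bulk_bytes can be lowered on the opensearch output block. A minimal sketch of doing so (the host and index below are placeholders, not our real configuration):

output {
  opensearch {
    hosts => ["https://example-domain.eu-west-1.es.amazonaws.com:443"]  # placeholder endpoint
    index => "logs-%{+YYYY.MM.dd}"                                      # placeholder index
    # Plugin default is 20 * 1024 * 1024 = 20971520 bytes (20 MiB);
    # setting it under 10485760 bytes keeps each _bulk request within the
    # 10 MiB payload cap that some AWS OpenSearch instance types enforce.
    target_bulk_bytes => 10000000
  }
}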
The plugin version I have already includes the fix:
logstash-output-opensearch (2.0.0), Logstash version: 8.6.1
As a workaround, we always have to restart the service.
Expected behavior
Receive logs without interruptions
Plugins
logstash-codec-avro (3.4.0)
logstash-codec-cef (6.2.6)
logstash-codec-collectd (3.1.0)
logstash-codec-dots (3.0.6)
logstash-codec-edn (3.1.0)
logstash-codec-edn_lines (3.1.0)
logstash-codec-es_bulk (3.1.0)
logstash-codec-fluent (3.4.1)
logstash-codec-graphite (3.0.6)
logstash-codec-json (3.1.1)
logstash-codec-json_lines (3.1.0)
logstash-codec-line (3.1.1)
logstash-codec-msgpack (3.1.0)
logstash-codec-multiline (3.1.1)
logstash-codec-netflow (4.3.0)
logstash-codec-plain (3.1.0)
logstash-codec-rubydebug (3.1.0)
logstash-filter-aggregate (2.10.0)
logstash-filter-anonymize (3.0.6)
logstash-filter-cidr (3.1.3)
logstash-filter-clone (4.2.0)
logstash-filter-csv (3.1.1)
logstash-filter-date (3.1.15)
logstash-filter-de_dot (1.0.4)
logstash-filter-dissect (1.2.5)
logstash-filter-dns (3.1.5)
logstash-filter-drop (3.0.5)
logstash-filter-elasticsearch (3.13.0)
logstash-filter-fingerprint (3.4.1)
logstash-filter-geoip (7.2.12)
logstash-filter-grok (4.4.3)
logstash-filter-http (1.4.1)
logstash-filter-json (3.2.0)
logstash-filter-kv (4.7.0)
logstash-filter-memcached (1.1.0)
logstash-filter-metrics (4.0.7)
logstash-filter-multiline (3.0.4)
logstash-filter-mutate (3.5.6)
logstash-filter-prune (3.0.4)
logstash-filter-ruby (3.1.8)
logstash-filter-sleep (3.0.7)
logstash-filter-split (3.1.8)
logstash-filter-syslog_pri (3.1.1)
logstash-filter-throttle (4.0.4)
logstash-filter-translate (3.4.0)
logstash-filter-truncate (1.0.5)
logstash-filter-urldecode (3.0.6)
logstash-filter-useragent (3.3.3)
logstash-filter-uuid (3.0.5)
logstash-filter-xml (4.2.0)
logstash-input-azure_event_hubs (1.4.4)
logstash-input-beats (6.4.1)
└── logstash-input-elastic_agent (alias)
logstash-input-couchdb_changes (3.1.6)
logstash-input-dead_letter_queue (2.0.0)
logstash-input-elasticsearch (4.16.0)
logstash-input-exec (3.6.0)
logstash-input-file (4.4.4)
logstash-input-ganglia (3.1.4)
logstash-input-gelf (3.3.2)
logstash-input-generator (3.1.0)
logstash-input-google_pubsub (1.2.2)
logstash-input-graphite (3.0.6)
logstash-input-heartbeat (3.1.1)
logstash-input-http (3.6.0)
logstash-input-http_poller (5.4.0)
logstash-input-imap (3.2.0)
logstash-input-jms (3.2.2)
logstash-input-pipe (3.1.0)
logstash-input-redis (3.7.0)
logstash-input-snmp (1.3.1)
logstash-input-snmptrap (3.1.0)
logstash-input-stdin (3.4.0)
logstash-input-syslog (3.6.0)
logstash-input-tcp (6.3.1)
logstash-input-twitter (4.1.0)
logstash-input-udp (3.5.0)
logstash-input-unix (3.1.2)
logstash-integration-aws (7.0.0)
├── logstash-codec-cloudfront
├── logstash-codec-cloudtrail
├── logstash-input-cloudwatch
├── logstash-input-s3
├── logstash-input-sqs
├── logstash-output-cloudwatch
├── logstash-output-s3
├── logstash-output-sns
└── logstash-output-sqs
logstash-integration-elastic_enterprise_search (2.2.1)
├── logstash-output-elastic_app_search
└── logstash-output-elastic_workplace_search
logstash-integration-jdbc (5.4.1)
├── logstash-input-jdbc
├── logstash-filter-jdbc_streaming
└── logstash-filter-jdbc_static
logstash-integration-kafka (10.12.0)
├── logstash-input-kafka
└── logstash-output-kafka
logstash-integration-rabbitmq (7.3.1)
├── logstash-input-rabbitmq
└── logstash-output-rabbitmq
logstash-output-csv (3.0.8)
logstash-output-elasticsearch (11.12.1)
logstash-output-email (4.1.1)
logstash-output-file (4.3.0)
logstash-output-google_pubsub (1.1.0)
logstash-output-graphite (3.1.6)
logstash-output-http (5.5.0)
logstash-output-lumberjack (3.1.9)
logstash-output-nagios (3.0.6)
logstash-output-null (3.0.5)
logstash-output-opensearch (2.0.0)
logstash-output-pipe (3.0.6)
logstash-output-redis (5.0.0)
logstash-output-stdout (3.1.4)
logstash-output-tcp (6.1.1)
logstash-output-udp (3.2.0)
logstash-output-webhdfs (3.0.6)
logstash-patterns-core (4.3.4)
Thank you very much!
Hi @operacionesckstorm, thanks for creating the issue.
While I take a look at this issue: the original issue https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/785 has the message {"Message":"Request size exceeded 10485760 bytes"}. Did you get the same message in the 413 response?
I see that your content length is over the max limit: content_length=>21732354.
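For reference, assuming the default was not overridden: 21732354 bytes is about 20.7 MiB, just over the plugin's default target_bulk_bytes of 20 * 1024 * 1024 = 20971520 bytes, and roughly twice the 10485760-byte (10 MiB) limit quoted in the linked issue.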
Hi @asifsmohammed,
I do not get {"Message": "Request size exceeded 10485760 bytes"}; the message is just the one I included above. I understand that the 413 error code has the same cause, the size of the request.
Thank you for your reply!
Hi, for about two weeks now we've been getting the same error regularly:
[2024-08-05T10:48:47,306][ERROR][logstash.outputs.opensearch] Encountered a retryable error (will retry with exponential backoff) {:code=>413, :url=>"https://...:443/_bulk", :content_length=>130662682}
We're using logstash-output-opensearch-1.3.0, where this should already be fixed.
Have you had a chance to look into what might be causing this error?
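For scale, 130662682 bytes is roughly 124 MiB, more than six times the plugin's 20 MiB default target_bulk_bytes. If the chunking works like the elasticsearch plugin this one was forked from, a single event that serializes larger than the target would still be sent as one oversized request, so it may be worth checking for unusually large individual events or a target_bulk_bytes override in the pipeline.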