
High Memory Usage issue

Open · poliphilson opened this issue 3 weeks ago · 2 comments

Bug Report

Describe the bug

I've observed memory consumption growing continuously, reaching up to 49GB. Here's my configuration - what could be the issue?

  service: |
    [SERVICE]
        daemon Off
        flush 1
        Log_Level info
        parsers_File /fluent-bit/etc/parsers.conf
        parsers_File /fluent-bit/etc/conf/custom_parsers.conf
        http_server On
        http_listen 0.0.0.0
        http_port 2020
        Health_Check On
        storage.path /var/log/fluent-bit-storage
        storage.sync normal
        storage.checksum off
        storage.backlog.mem_limit 5M

  inputs: |
    [INPUT]
        Name tail
        Path /var/log/containers/*.log
        Tag kubernetes-was.*
        DB /var/log/fluent-bit_tail.db
        mem_buf_limit 5M
        storage.type filesystem
        storage.pause_on_chunks_overlimit on

    [INPUT]
        Name tail
        Path /var/log/containers/*.log
        Tag kubernetes-system.*
        DB /var/log/fluent-bit_system_tail.db
        mem_buf_limit 5M
        storage.type filesystem
        storage.pause_on_chunks_overlimit on

    [INPUT]
        Name systemd
        Path /var/log/journal
        Tag systemd
        Systemd_Filter _SYSTEMD_UNIT=kubelet.service
        Systemd_Filter _SYSTEMD_UNIT=containerd.service
        Read_From_Tail On
        Strip_Underscores On
        mem_buf_limit 5M
        storage.type filesystem
        storage.pause_on_chunks_overlimit on

  filters: |
    [FILTER]
        Name kubernetes
        Match kubernetes-was.*
        Use_Kubelet On
        Tls.verify Off
        Labels Off
        Annotations On
        Kube_Tag_Prefix kubernetes-was.var.log.containers.
        Buffer_Size 1M

    [FILTER]
        Name kubernetes
        Match kubernetes-system.*
        Use_Kubelet On
        Tls.verify Off
        Labels Off
        Annotations On
        Kube_Tag_Prefix kubernetes-system.var.log.containers.
        Buffer_Size 1M

    [FILTER]
        Name grep
        Match kubernetes-was.*
        Regex $kubernetes['annotations']['logging/app'] .*

    [FILTER]
        Name grep
        Match kubernetes-system.*
        Regex $kubernetes['namespace_name'] ^(kube-system|default|etc)$

  outputs: |
    [OUTPUT]
        Name forward
        Match kubernetes-was.*
        Host vector
        Port 24224
        storage.total_limit_size 5G
        workers 2

    [OUTPUT]
        Name forward
        Match kubernetes-system.*
        Host vector
        Port 24225
        storage.total_limit_size 5G
        workers 2

    [OUTPUT]
        Name forward
        Match systemd
        Host vector
        Port 24226
        storage.total_limit_size 5G
        workers 2
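
For reference, chunk and buffer usage can be inspected through the HTTP server that is already enabled in this configuration. A minimal sketch (only the relevant lines plus one addition, not the full section) of how the [SERVICE] block could expose storage metrics:

  service: |
    [SERVICE]
        # ...existing settings above stay unchanged...
        http_server On
        http_port 2020
        # expose storage-layer metrics on the HTTP server (default is off)
        storage.metrics On

With this enabled, /api/v1/storage on port 2020 reports per-input chunk counts and buffer sizes, which can help show where the memory is accumulating.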

To Reproduce

  • Rubular link if applicable:
  • Example log message if applicable:
{"log":"YOUR LOG MESSAGE HERE","stream":"stdout","time":"2018-06-11T14:37:30.681701731Z"}
  • Steps to reproduce the problem:

Expected behavior

Screenshots

(screenshot attached)

Your Environment

  • Version used: 4.0.3
  • Configuration:
  • Environment name and version (e.g. Kubernetes? What version?): kubernetes
  • Server type and version:
  • Operating System and version:
  • Filters and plugins:

Additional context

poliphilson · Dec 03 '25 01:12

Are you able to replicate the same high memory issue with the latest version, v4.2.0?

edsiper · Dec 04 '25 23:12
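
(Side note: if this is deployed with the official fluent-bit Helm chart, which is an assumption based on the service:/inputs: values layout shown above, trying v4.2.0 should only require bumping the image tag in the values file and redeploying the same configuration, e.g.:

  image:
    tag: "4.2.0"

That would make it easy to compare memory behavior between 4.0.3 and 4.2.0 with an otherwise identical setup.)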

@edsiper I'm not sure. For now I've migrated to Vector and there are no issues there, but I still don't know why the problem was occurring.

poliphilson · Dec 05 '25 07:12