fluent-plugin-parser-cri
Pattern not matched issue
Hello,
I have an issue where some containerd logs fail to be matched. Example error messages from fluentd:
Error from the calico log
2022-05-16 05:09:38.523746465 +0000 fluent.warn: {"message":"[fluentd-containers.log] pattern not matched: \"2022-05-16T05:09:38.482863939Z stdout F 2022-05-16 05:09:38.482 [INFO][63] felix/summary.go 100: Summarising 10 dataplane reconciliation loops over 1m2.9s: avg=22ms longest=195ms ()\""}
Error from the calico-accountant log
2022-05-16 05:09:44.515427977 +0000 fluent.warn: {"message":"[fluentd-containers.log] pattern not matched: \"2022-05-16T05:09:43.751745039Z stderr F W0516 05:09:43.751605 1 collector.go:73] Dropping scrape; all 0 counters are below minimum count of 0\""}
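For context, a CRI log line carries four space-separated fields before the actual application output: timestamp, stream, a partial/full tag, and the message. The sketch below uses a regex that approximates (not reproduces) the plugin's pattern, to show that the message payload in the failing line above is plain text, which is why a nested JSON sub-parser cannot match it:

```python
import re

# Rough approximation of the CRI log-line layout parsed by
# fluent-plugin-parser-cri: <timestamp> <stream> <P|F> <message>
CRI_PATTERN = re.compile(
    r'^(?P<time>\S+) (?P<stream>stdout|stderr) (?P<logtag>[FP]) (?P<message>.*)$'
)

line = ('2022-05-16T05:09:38.482863939Z stdout F '
        '2022-05-16 05:09:38.482 [INFO][63] felix/summary.go 100: '
        'Summarising 10 dataplane reconciliation loops over 1m2.9s')

m = CRI_PATTERN.match(line)
print(m.group('stream'))   # stdout
# The payload is free-form text, not JSON -- a nested json
# sub-parser would raise a parse failure on it.
print(m.group('message'))
```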
I'm using v0.1.1 and fluent-plugin-concat v2.5.0.
@robinAwallace are you trying to use a nested parser in your configuration?
@jkasarherou I have the same issue and I'm using the nested json parser as suggested in the README of this project.
Hi, same for me after moving from Docker to containerd. After looking for a solution to parse this format, I updated the fluentd config in my EKS cluster (k8s 1.24) to look like this for the containers section:
...
<source>
@type tail
@id in_tail_container_logs
path "/var/log/containers/*.log"
pos_file "/var/log/fluentd-containers.log.pos"
tag "kubernetes.*"
exclude_path /var/log/containers/*_kube-system_*
read_from_head true
<parse>
@type "cri"
merge_cri_fields false
unmatched_lines
<parse>
@type json
time_key "time"
time_format "%Y-%m-%dT%H:%M:%S.%L%z"
time_type string
</parse>
</parse>
</source>
...
The rest is all defaults, plus a match section to send everything to AWS OpenSearch. The logs in the fluentd containers keep saying 'pattern not matched':
[warn]: #0 [in_tail_container_logs] pattern not matched: "2022-12-21T16:15:46.891996947Z stdout F 10.1.40.206 - - [21/Dec/2022:16:15:46 +0000] \"GET / HTTP/1.1\" 200 8430 \"-\" \"kube-probe/1.24+\" \"-\""
The logs in the files look like this:
2022-12-21T16:39:12.398315079Z stdout F 10.1.92.199 - - [21/Dec/2022:16:39:12 +0000] "GET / HTTP/1.1" 200 8430 "-" "kube-probe/1.24+" "-"
and
[root@ip-10-1-92-199 containers]# cat /var/log/pods/game_game-service-77d9c99c7c-rj89b_6bc7066f-4e3f-4270-8d25-50662c6f7498/game-service/0.log
2022-12-21T08:02:24.551775635Z stdout F [2022-12-21 08:02:24.551] [info] Starting with the following config:
2022-12-21T08:02:24.551823616Z stdout F {
2022-12-21T08:02:24.551865406Z stdout F "developmentMode": false,
2022-12-21T08:02:24.551895296Z stdout F "prometheus": {
2022-12-21T08:02:24.551898507Z stdout F "port": 9500,
2022-12-21T08:02:24.551902077Z stdout F "method": "pull",
2022-12-21T08:02:24.551917127Z stdout F },
2022-12-21T08:02:24.551920227Z stdout F "logger": {
2022-12-21T08:02:24.551922787Z stdout F "level": 1,
2022-12-21T08:02:24.551925457Z stdout F "format": 1
2022-12-21T08:02:24.551928457Z stdout F }
2022-12-21T08:02:24.551931257Z stdout F }
2022-12-21T08:02:57.915268903Z stdout F {"time": "2022-12-21T08:02:57.915084+00:00", "name": "Queue", "level": "debug", "process": 1, "thread": 1, "message": "Connection was closed."},
2022-12-21T08:04:57.907567607Z stdout F {"time": "2022-12-21T08:04:57.907428+00:00", "name": "Queue", "level": "debug", "process": 1, "thread": 1, "message": "Connection was closed."},
Maybe the issue is the capital Z in the log itself vs the lowercase z in the time_format?
@Dmitry1987 your log,
10.1.92.199 - - [21/Dec/2022:16:39:12 +0000] "GET / HTTP/1.1" 200 8430 "-" "kube-probe/1.24+" "-"
, is not JSON. Don't put a @type json sub-parser configuration for these logs. See also https://github.com/fluent/fluent-plugin-parser-cri#configuration
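For reference, a minimal <parse> section for plain-text application logs drops the nested JSON sub-parser entirely (this is a sketch based on this thread's config, not a drop-in replacement for every setup):

```
<parse>
  @type cri
  merge_cri_fields false
</parse>
```

With this, the CRI prefix (timestamp, stream, tag) is still split off, and the remaining message is kept as-is instead of being force-parsed as JSON.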
Thanks, I will remove it. Actually Docker was the one wrapping messages in JSON, I forgot that fact... our app logs plain text, so it's a silly config mistake on my side 😅
It worked, thanks!