logging-operator
unable to serialize JSON type logs
This is my log:

```json
{
  "level": "info",
  "time": "2024-03-28T10:34:44.345Z",
  "req": {
    "id": 6,
    "method": "POST",
    "url": "/xx/xx/xxx",
    "query": {},
    "headers": {
      "x-request-id": "91d4b3e2fcdf23f1c6ccccccccc90cc",
      "x-real-ip": "10.100.00.000",
      "x-forwarded-for": "10.100.00.000",
      "x-forwarded-host": "xxxx-sit.xxxx.cn",
      "x-forwarded-port": "443",
      "x-forwarded-proto": "https",
      "x-forwarded-scheme": "https",
      "x-scheme": "https",
      "x-original-forwarded-for": "10.100.00.000, 10.100.00.000",
      "content-length": "59",
      "user-agent": "Dart/3.1 (dart:io)",
      "content-type": "application/json"
    }
  },
  "context": "MessageService",
  "error": "RESTEASY003210: Could not find resource for full path: https://ccc.ccc.ccc.com/api/v1/users.info?username=cccc",
  "msg": "message log"
}
```
It is JSON-type log data, and my Flow configuration is this:

```yaml
spec:
  filters:
    - tag_normaliser: {}
    - parser:
        key_name: message
        parse:
          type: json
        remove_key_name_field: true
        reserve_data: true
    - record_transformer:
        enable_ruby: true
        records:
          - app: ${record["kubernetes"]["labels"]["app"]}
          - node: ${record["kubernetes"]["host"]}
          - namespace: ${record["kubernetes"]["namespace_name"]}
        remove_keys: $.kubernetes.labels,$.kubernetes.host,$.kubernetes.namespace_name
  globalOutputRefs: []
  localOutputRefs:
    - output-alies
  match:
    - select:
        container_names: []
        hosts: []
        labels:
          idp.app.logging: 'true'
```
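As a side note on what that parser filter does, here is a minimal Python sketch of the `key_name: message` / `remove_key_name_field: true` / `reserve_data: true` combination. This is illustrative only, not the operator's actual implementation, and the sample record shape is an assumption:

```python
import json

def parse_message_field(record, key_name="message"):
    """Sketch of fluentd's parser filter: parse the JSON string stored
    under key_name and merge it into the record."""
    raw = record.get(key_name)
    if raw is None:
        return record            # no such key: record passes through unchanged
    parsed = json.loads(raw)     # raises ValueError if the payload is not valid JSON
    merged = dict(record)
    del merged[key_name]         # remove_key_name_field: true drops the raw string
    merged.update(parsed)        # reserve_data: true keeps the other original keys
    return merged

# Hypothetical record as it might look before the filter runs:
record = {"stream": "stderr", "message": '{"level": "info", "msg": "message log"}'}
print(parse_message_field(record))
# → {'stream': 'stderr', 'level': 'info', 'msg': 'message log'}
```

Note that `json.loads` only succeeds if the whole payload under `message` is a single valid JSON document, which is worth keeping in mind when a log record spans multiple lines.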
After the logs are transmitted to Elasticsearch, nothing is displayed in Kibana. However, if the Flow out configuration remains unchanged, my logs are:
```json
{"name": "John", "age": 30, "city": "New York", "colors": {"first": "red", "second": "blue"}}
```
In Kibana, these logs can be displayed and serialized:
```json
{
  "_index": "uat-2024-03",
  "_type": "_doc",
  "_id": "xxx",
  "_version": 1,
  "_score": null,
  "_source": {
    "stream": "stderr",
    "logtag": "F",
    "kubernetes": {
      "pod_name": "ginoneuat-xxxx-q8ht2",
      "pod_id": "b146652",
      "container_name": "ginoneuat",
      "docker_id": "b6f8c3cc",
      "container_hash": "xxxx",
      "container_image": "/go/goone:v15"
    },
    "name": "John",
    "age": 30,
    "city": "New York",
    "colors": {
      "first": "red",
      "second": "blue"
    },
    "app": "ginoneuat",
    "node": "uat-xxx-worker",
    "namespace": "idp",
    "@timestamp": "2024-03-28T23:30:14.336201933+00:00"
  },
  "fields": {
    "@timestamp": [
      "2024-03-28T23:30:14.336Z"
    ]
  },
  "highlight": {
    "app": [
      "@kibana-highlighted-field@ginoneuat@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1711668614336
  ]
}
```
I confirm that I only modified the content of the logs and did not change any other configuration. Could you please help me understand why my actual system logs are not being displayed? Thank you!
My logging-operator version is v103.0.0+up3.17.10, installed via Rancher.
You should raise the log level and check the fluentd logs to understand what is going on.
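For reference, a minimal sketch of raising the fluentd log level through the Logging resource. The resource name `rancher-logging` and the namespace `cattle-logging-system` are assumptions based on a typical Rancher install; verify the `logLevel` field against the CRD reference for your operator version:

```yaml
apiVersion: logging.banzaicloud.io/v1beta1
kind: Logging
metadata:
  name: rancher-logging
spec:
  controlNamespace: cattle-logging-system
  fluentd:
    logLevel: debug
```

After applying this, tail the fluentd StatefulSet pod's logs in the control namespace (pod names depend on your install), for example `kubectl -n cattle-logging-system logs -f <fluentd-pod-name>`.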
Also, what does this mean?

> However, if the Flow out configuration remains unchanged, my logs are

Does it mean you don't configure any filters?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions!