
`kubernetes_logs` source seems to fail to annotate when pod names are reused

Open jszwedko opened this issue 1 year ago • 4 comments

A note for the community

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Problem

Vector seems to fail to annotate logs from the kubernetes_logs source when pod names are reused. I didn't reproduce this, but there are a few comments (in the references section) that seem to indicate this is the case.

One hypothesis is that https://github.com/vectordotdev/vector/blob/aee1984c15ff8e43992ce80fc4f3f1b95220f4fc/src/sources/kubernetes_logs/pod_metadata_annotator.rs#L100 should be looking up by the uid as well (under extra) rather than just name and namespace.
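
For context on where a uid might come from, here is a minimal, hypothetical sketch (not Vector's annotator code; `pod_key_from_path` is invented for this example) of pulling the namespace, pod name, and uid out of the kubelet log path the source already has. Note that a later comment in this thread questions whether the uid embedded in the path is the uid needed for the lookup.

```rust
// Hypothetical sketch: the kubelet log path encodes namespace, pod name, and a uid,
//   /var/log/pods/<namespace>_<name>_<uid>/<container>/<n>.log
// so a uid-qualified lookup key could in principle be derived from it.
fn pod_key_from_path(path: &str) -> Option<(String, String, String)> {
    let dir = path.strip_prefix("/var/log/pods/")?.split('/').next()?;
    let mut parts = dir.splitn(3, '_');
    let namespace = parts.next()?.to_string();
    let name = parts.next()?.to_string();
    let uid = parts.next()?.to_string();
    Some((namespace, name, uid))
}

fn main() {
    let key = pod_key_from_path(
        "/var/log/pods/app_flog-0_41bb0f6a-a088-472f-abab-2cc656ce5a55/flog/0.log",
    );
    assert_eq!(
        key,
        Some((
            "app".to_string(),
            "flog-0".to_string(),
            "41bb0f6a-a088-472f-abab-2cc656ce5a55".to_string(),
        ))
    );
}
```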

Configuration

No response

Version

vector 0.22.3

Debug Output

No response

Example Data

No response

Additional Context

No response

References

  • https://github.com/vectordotdev/vector/issues/12909#issuecomment-1169088156
  • https://github.com/vectordotdev/vector/issues/12014#issuecomment-1168953164 (and following comments)

jszwedko avatar Jul 07 '22 15:07 jszwedko

Hello. Any updates here? We are suffering :(

r0b3r4 avatar Jul 29 '22 19:07 r0b3r4

Hi @r0b3r4 ! Unfortunately not yet, but it is on our backlog. We are also happy to help guide a contribution if someone has the time and ability to address it.

jszwedko avatar Jul 29 '22 19:07 jszwedko

Hello,

Error

Failed to annotate event with pod metadata

happens here

https://github.com/vectordotdev/vector/blob/c402e6ed94400959dbc298d4b342dcb0842089d4/src/sources/kubernetes_logs/mod.rs#L498

and the None probably comes from here https://github.com/vectordotdev/vector/blob/c402e6ed94400959dbc298d4b342dcb0842089d4/src/sources/kubernetes_logs/pod_metadata_annotator.rs#L99 or here https://github.com/vectordotdev/vector/blob/c402e6ed94400959dbc298d4b342dcb0842089d4/src/sources/kubernetes_logs/pod_metadata_annotator.rs#L101

The second variant seems more likely.

I think this happens because the stored information about the pod is deleted on the Deleted event: https://github.com/vectordotdev/vector/blob/c402e6ed94400959dbc298d4b342dcb0842089d4/src/kubernetes/reflector.rs#L38

After a pod is deleted (someone noted the problem appears when the pod is scaled down and back up under the same name), the Delete event is delayed.
The new Create event stores information about the newly created pod in the Store, but when the delay timer expires, the entry for the old pod with that name is removed. Because the new pod shares the same name, the Store ends up dropping the new pod's information as well, and self.pods_state_reader.get(&obj)?; returns None.

This is just my opinion based on my investigation of the problem. I cannot reproduce it myself, so it remains a theory.

sillent avatar Aug 05 '22 11:08 sillent
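
To make the theorized race concrete, here is a minimal, hypothetical sketch (not Vector's reflector code; `NameKey` and `PodInfo` are invented for this example) of how a delayed Delete keyed only by namespace and name could evict the entry the new pod's Create just wrote:

```rust
use std::collections::HashMap;

type NameKey = (String, String); // (namespace, name)

#[derive(Debug, PartialEq)]
struct PodInfo {
    uid: String,
}

fn main() {
    let key: NameKey = ("app".into(), "flog-0".into());
    let mut store: HashMap<NameKey, PodInfo> = HashMap::new();

    // 1. The old pod is running and its metadata is stored under (namespace, name).
    store.insert(key.clone(), PodInfo { uid: "41bb0f6a-a088-472f-abab-2cc656ce5a55".into() });

    // 2. The pod is recreated with the same name; its Create event arrives before
    //    the (delayed) Delete of the old pod and overwrites the entry.
    store.insert(key.clone(), PodInfo { uid: "976669c2-8cd7-41b5-9a13-bd481c386c7f".into() });

    // 3. delay_deletion_ms later, the delayed Delete for the *old* pod fires, but
    //    because the key carries no uid it removes the *new* pod's entry instead.
    store.remove(&key);

    // 4. Annotating the new pod's log lines now finds nothing and fails.
    assert_eq!(store.get(&key), None);
}
```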

comment: @jszwedko

One hypothesis is that ... should be looking up by the uid as well (under extra) rather than just name and namespace

I don't think that will work:

  1. We can't use the uid that is present in the Kubernetes resource.

It looks like uid: 5751f785-5689-4868-a573-d9e0096821ff, but the pod_uid that we can get from file_info is not the uid we need.

  2. extra is not included in the Hash and PartialEq implementations

https://github.com/kube-rs/kube-rs/blob/bf885f05c39c046f68e34f19b95402d3326c1715/kube-runtime/src/reflector/object_ref.rs#L56

and when we try

https://github.com/vectordotdev/vector/blob/c402e6ed94400959dbc298d4b342dcb0842089d4/src/sources/kubernetes_logs/pod_metadata_annotator.rs#L101

the lookup will not take the data in this field into account.

sillent avatar Aug 10 '22 05:08 sillent
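
A small hypothetical sketch of the point above (the `ObjRef` type below is an illustration, not kube-rs's actual `ObjectRef`): when a key's `Hash`/`PartialEq` ignore a field, two keys differing only in that field are treated as the same entry, so stashing the uid there cannot disambiguate reused names.

```rust
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

#[derive(Debug)]
struct ObjRef {
    name: String,
    namespace: String,
    uid: Option<String>, // stands in for the `extra` data
}

impl PartialEq for ObjRef {
    fn eq(&self, other: &Self) -> bool {
        // The uid is deliberately not compared, mirroring how `extra` is described
        // as being excluded from ObjectRef's comparisons.
        self.name == other.name && self.namespace == other.namespace
    }
}
impl Eq for ObjRef {}

impl Hash for ObjRef {
    fn hash<H: Hasher>(&self, state: &mut H) {
        self.name.hash(state);
        self.namespace.hash(state);
    }
}

fn main() {
    let old = ObjRef {
        name: "flog-0".into(),
        namespace: "app".into(),
        uid: Some("41bb0f6a-a088-472f-abab-2cc656ce5a55".into()),
    };
    let new = ObjRef {
        name: "flog-0".into(),
        namespace: "app".into(),
        uid: Some("976669c2-8cd7-41b5-9a13-bd481c386c7f".into()),
    };

    let mut store = HashMap::new();
    store.insert(old, "old pod metadata");

    // Looking up with the new pod's uid still returns the old entry, because the
    // uid plays no part in hashing or equality.
    assert_eq!(store.get(&new), Some(&"old pod metadata"));
}
```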

Thanks @sillent . It does appear to be the case that the extra data isn't included in comparisons of pod info. That strikes me as a bit odd. I wonder if we should open an issue upstream on kube-rs to get their thoughts around this.

jszwedko avatar Aug 12 '22 18:08 jszwedko

Maybe we should consider having our own metadata cache based on uid instead of changing the event order of the kube-rs store. It seems that kube-rs as a whole is built around resource names rather than uids, so it may be hard to just add uid support.

shenxn avatar Aug 31 '22 13:08 shenxn
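
One hypothetical variant of that idea, sketched below (none of this is Vector's actual code; all types are invented for the example): keep the lookup key as (namespace, name) so entries can still be found from the log path, record the uid in the value, and only honor a delayed Delete when the uid still matches, so a stale Delete cannot evict the new pod's entry.

```rust
use std::collections::HashMap;

type NameKey = (String, String); // (namespace, name)

#[derive(Debug, PartialEq)]
struct PodMeta {
    uid: String,
    // In a real cache the labels/annotations used for annotation would live here too.
}

struct MetadataCache {
    entries: HashMap<NameKey, PodMeta>,
}

impl MetadataCache {
    fn apply(&mut self, key: NameKey, meta: PodMeta) {
        self.entries.insert(key, meta);
    }

    /// Honor a (possibly delayed) Delete only if the stored entry still belongs
    /// to the same pod instance, i.e. the uid matches.
    fn delete(&mut self, key: &NameKey, uid: &str) {
        if self.entries.get(key).map(|m| m.uid.as_str()) == Some(uid) {
            self.entries.remove(key);
        }
    }
}

fn main() {
    let key: NameKey = ("app".into(), "flog-0".into());
    let mut cache = MetadataCache { entries: HashMap::new() };

    cache.apply(key.clone(), PodMeta { uid: "41bb0f6a".into() });
    // The pod is recreated under the same name:
    cache.apply(key.clone(), PodMeta { uid: "976669c2".into() });
    // The delayed Delete for the old pod arrives later and is ignored, because
    // the stored uid now belongs to the new pod.
    cache.delete(&key, "41bb0f6a");

    assert_eq!(cache.entries.get(&key), Some(&PodMeta { uid: "976669c2".into() }));
}
```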

I have steps to reproduce the problem. Hope it helps.

Arch: x86_64, OS: Ubuntu 20.04

  1. Install minikube & helm
curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube-linux-amd64
sudo install minikube-linux-amd64 /usr/local/bin/minikube
minikube start

curl https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3 | bash
  2. Install Vector using Helm
helm repo add vector https://helm.vector.dev
helm repo update

cat <<EOF > values.yaml
role: Agent
service:
  enabled: false
image:
  pullPolicy: IfNotPresent
  repository: timberio/vector
  tag: 0.24.1-distroless-libc
customConfig:
  data_dir: /vector-data-dir
  api:
    enabled: true
    address: 127.0.0.1:8686
    playground: false
  sources:
    kubernetes_logs:
      type: kubernetes_logs
      extra_field_selector: "metadata.namespace=app"
      delay_deletion_ms: 30000
      glob_minimum_cooldown_ms: 5000
  sinks:
    stdout:
      type: blackhole
      inputs: [kubernetes_logs]
      print_interval_secs: 1
EOF

helm install vector vector/vector --namespace vector --create-namespace --values values.yaml
  3. Deploy a StatefulSet that generates logs
cat <<EOF > stateful.yaml
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: flog
  namespace: app
spec:
  selector:
    matchLabels:
      app: flog
  serviceName: flog
  template:
    metadata:
      labels:
        app: flog
    spec:
      containers:
      - name: flog
        image: mingrammer/flog
        imagePullPolicy: IfNotPresent
        command: ['/bin/flog', '--delay', '300ms', '--loop']
EOF

kubectl create namespace app
kubectl apply -f stateful.yaml
  4. Wait a few seconds and restart the StatefulSet
kubectl rollout restart statefulset flog -n app
  5. After delay_deletion_ms (30s), errors appear and the file is no longer watched
kubectl logs -n vector daemonset/vector -f

...
2022-10-11T07:18:50.138699Z  INFO vector::sinks::blackhole::sink: Total events collected events=0 raw_bytes_collected=0
2022-10-11T07:18:51.139505Z  INFO vector::sinks::blackhole::sink: Total events collected events=0 raw_bytes_collected=0
2022-10-11T07:18:52.138783Z  INFO vector::sinks::blackhole::sink: Total events collected events=0 raw_bytes_collected=0
2022-10-11T07:18:53.139386Z  INFO vector::sinks::blackhole::sink: Total events collected events=0 raw_bytes_collected=0
2022-10-11T07:18:54.139460Z  INFO vector::sinks::blackhole::sink: Total events collected events=0 raw_bytes_collected=0
2022-10-11T07:18:55.139002Z  INFO vector::sinks::blackhole::sink: Total events collected events=0 raw_bytes_collected=0
2022-10-11T07:18:55.385353Z  INFO source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}:file_server: vector::internal_events::file::source: Found new file to watch. file=/var/log/pods/app_flog-0_41bb0f6a-a088-472f-abab-2cc656ce5a55/flog/0.log
2022-10-11T07:18:56.139296Z  INFO vector::sinks::blackhole::sink: Total events collected events=17 raw_bytes_collected=63827
2022-10-11T07:18:57.139732Z  INFO vector::sinks::blackhole::sink: Total events collected events=21 raw_bytes_collected=78965
2022-10-11T07:18:58.138776Z  INFO vector::sinks::blackhole::sink: Total events collected events=23 raw_bytes_collected=86516
2022-10-11T07:18:59.138964Z  INFO vector::sinks::blackhole::sink: Total events collected events=27 raw_bytes_collected=101657
2022-10-11T07:19:00.139495Z  INFO vector::sinks::blackhole::sink: Total events collected events=30 raw_bytes_collected=112995
2022-10-11T07:19:01.139166Z  INFO vector::sinks::blackhole::sink: Total events collected events=33 raw_bytes_collected=124380
2022-10-11T07:19:02.139552Z  INFO vector::sinks::blackhole::sink: Total events collected events=37 raw_bytes_collected=139510
2022-10-11T07:19:03.139134Z  INFO vector::sinks::blackhole::sink: Total events collected events=40 raw_bytes_collected=150888
2022-10-11T07:19:04.139703Z  INFO vector::sinks::blackhole::sink: Total events collected events=44 raw_bytes_collected=166004
2022-10-11T07:19:05.139373Z  INFO vector::sinks::blackhole::sink: Total events collected events=47 raw_bytes_collected=177326
2022-10-11T07:19:06.139341Z  INFO vector::sinks::blackhole::sink: Total events collected events=51 raw_bytes_collected=192442
2022-10-11T07:19:07.139243Z  INFO vector::sinks::blackhole::sink: Total events collected events=53 raw_bytes_collected=200011
2022-10-11T07:19:08.139022Z  INFO vector::sinks::blackhole::sink: Total events collected events=56 raw_bytes_collected=211358
2022-10-11T07:19:09.138961Z  INFO vector::sinks::blackhole::sink: Total events collected events=60 raw_bytes_collected=226545
2022-10-11T07:19:10.138944Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:11.139492Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:11.511156Z  WARN source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}:file_server: vector::internal_events::file::source: Currently ignoring file too small to fingerprint. file=/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log
2022-10-11T07:19:11.511228Z  INFO source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}:file_server: vector::internal_events::file::source: Stopped watching file. file=/var/log/pods/app_flog-0_41bb0f6a-a088-472f-abab-2cc656ce5a55/flog/0.log
2022-10-11T07:19:12.139431Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:13.139277Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:14.139486Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:15.139013Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:16.139594Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:17.139294Z  INFO vector::sinks::blackhole::sink: Total events collected events=62 raw_bytes_collected=234098
2022-10-11T07:19:17.658813Z  INFO source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}:file_server: vector::internal_events::file::source: Found new file to watch. file=/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log
2022-10-11T07:19:18.139459Z  INFO vector::sinks::blackhole::sink: Total events collected events=83 raw_bytes_collected=317571
2022-10-11T07:19:19.139381Z  INFO vector::sinks::blackhole::sink: Total events collected events=86 raw_bytes_collected=329623
2022-10-11T07:19:20.139517Z  INFO vector::sinks::blackhole::sink: Total events collected events=90 raw_bytes_collected=345653
2022-10-11T07:19:21.139715Z  INFO vector::sinks::blackhole::sink: Total events collected events=93 raw_bytes_collected=357649
2022-10-11T07:19:22.138940Z  INFO vector::sinks::blackhole::sink: Total events collected events=97 raw_bytes_collected=373676
2022-10-11T07:19:23.139384Z  INFO vector::sinks::blackhole::sink: Total events collected events=100 raw_bytes_collected=385710
2022-10-11T07:19:24.138941Z  INFO vector::sinks::blackhole::sink: Total events collected events=103 raw_bytes_collected=397726
2022-10-11T07:19:25.139395Z  INFO vector::sinks::blackhole::sink: Total events collected events=107 raw_bytes_collected=413761
2022-10-11T07:19:26.138829Z  INFO vector::sinks::blackhole::sink: Total events collected events=110 raw_bytes_collected=425802
2022-10-11T07:19:27.139129Z  INFO vector::sinks::blackhole::sink: Total events collected events=113 raw_bytes_collected=437853
2022-10-11T07:19:28.139585Z  INFO vector::sinks::blackhole::sink: Total events collected events=117 raw_bytes_collected=453871
2022-10-11T07:19:29.138723Z  INFO vector::sinks::blackhole::sink: Total events collected events=119 raw_bytes_collected=461851
2022-10-11T07:19:30.139117Z  INFO vector::sinks::blackhole::sink: Total events collected events=123 raw_bytes_collected=477880
2022-10-11T07:19:31.139719Z  INFO vector::sinks::blackhole::sink: Total events collected events=126 raw_bytes_collected=489856
2022-10-11T07:19:32.139152Z  INFO vector::sinks::blackhole::sink: Total events collected events=130 raw_bytes_collected=505897
2022-10-11T07:19:33.138772Z  INFO vector::sinks::blackhole::sink: Total events collected events=133 raw_bytes_collected=517883
2022-10-11T07:19:34.139463Z  INFO vector::sinks::blackhole::sink: Total events collected events=137 raw_bytes_collected=533925
2022-10-11T07:19:35.139263Z  INFO vector::sinks::blackhole::sink: Total events collected events=140 raw_bytes_collected=545969
2022-10-11T07:19:36.139564Z  INFO vector::sinks::blackhole::sink: Total events collected events=143 raw_bytes_collected=558021
2022-10-11T07:19:37.139875Z  INFO vector::sinks::blackhole::sink: Total events collected events=147 raw_bytes_collected=574050
2022-10-11T07:19:38.138709Z  INFO vector::sinks::blackhole::sink: Total events collected events=150 raw_bytes_collected=586065
2022-10-11T07:19:39.139025Z  INFO vector::sinks::blackhole::sink: Total events collected events=153 raw_bytes_collected=598091
2022-10-11T07:19:40.139408Z  INFO vector::sinks::blackhole::sink: Total events collected events=157 raw_bytes_collected=614132
2022-10-11T07:19:41.131602Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"95.78.162.196 - - [11/Oct/2022:07:19:40 +0000] \\\"POST /plug-and-play/infomediaries/back-end/partnerships HTTP/2.0\\\" 502 26105\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:40.910539473Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:41.131434537Z)}), size_cache: AtomicCell { value: Some(641) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:41.138897Z  INFO vector::sinks::blackhole::sink: Total events collected events=160 raw_bytes_collected=623012
2022-10-11T07:19:41.264256Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"244.25.77.220 - - [11/Oct/2022:07:19:40 +0000] \\\"DELETE /innovative/systems/bandwidth/rich HTTP/1.1\\\" 401 212\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:41.210739632Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:41.264157015Z)}), size_cache: AtomicCell { value: Some(625) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:41.528271Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"94.186.229.214 - spencer6681 [11/Oct/2022:07:19:41 +0000] \\\"GET /strategize/compelling HTTP/1.1\\\" 304 27222\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:41.51087386Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:41.528173029Z)}), size_cache: AtomicCell { value: Some(622) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:42.048539Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"158.241.102.206 - - [11/Oct/2022:07:19:41 +0000] \\\"POST /vortals HTTP/2.0\\\" 204 5896\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:41.811246315Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:42.048425827Z)}), size_cache: AtomicCell { value: Some(600) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:42.116620Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"81.60.133.76 - - [11/Oct/2022:07:19:41 +0000] \\\"POST /next-generation/plug-and-play HTTP/1.0\\\" 501 12366\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:42.111452547Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:42.116528443Z)}), size_cache: AtomicCell { value: Some(620) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:42.139693Z  INFO vector::sinks::blackhole::sink: Total events collected events=164 raw_bytes_collected=626348
2022-10-11T07:19:42.636272Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"3.227.108.2 - runte6855 [11/Oct/2022:07:19:42 +0000] \\\"HEAD /integrate/disintermediate/productize HTTP/2.0\\\" 100 27015\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:42.411685455Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:42.636161954Z)}), size_cache: AtomicCell { value: Some(634) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:42.769753Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"117.108.17.233 - king2280 [11/Oct/2022:07:19:42 +0000] \\\"PUT /experiences HTTP/1.1\\\" 203 28004\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:42.711893151Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:42.769613709Z)}), size_cache: AtomicCell { value: Some(610) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:43.032013Z ERROR source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}: vector::internal_events::kubernetes_logs: Failed to annotate event with pod metadata. event=Log(LogEvent { inner: Inner { fields: Object({"file": Bytes(b"/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log"), "message": Bytes(b"{\"log\":\"216.225.220.81 - - [11/Oct/2022:07:19:42 +0000] \\\"DELETE /functionalities/viral/magnetic/bricks-and-clicks HTTP/1.0\\\" 404 29636\\n\",\"stream\":\"stdout\",\"time\":\"2022-10-11T07:19:43.012386844Z\"}"), "source_type": Bytes(b"kubernetes_logs"), "timestamp": Timestamp(2022-10-11T07:19:43.031919873Z)}), size_cache: AtomicCell { value: Some(643) } }, metadata: EventMetadata { value: Object({}), secrets: {}, finalizers: EventFinalizers([]), schema_definition: Definition { event_kind: Kind { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), undefined: Some(()), array: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }), object: Some(Collection { known: {}, unknown: Unknown(Infinite(Infinite { bytes: Some(()), integer: Some(()), float: Some(()), boolean: Some(()), timestamp: Some(()), regex: Some(()), null: Some(()), array: Some(()), object: Some(()) })) }) }, metadata_kind: Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: None, array: None, object: Some(Collection { known: {}, unknown: Unknown(Exact(Kind { bytes: None, integer: None, float: None, boolean: None, timestamp: None, regex: None, null: None, undefined: Some(()), array: None, object: None })) }) }, meaning: {}, log_namespaces: {Vector, Legacy} } } }) error_code="annotation_failed" error_type="reader_failed" stage="processing"
2022-10-11T07:19:43.139729Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:43.293952Z  INFO source{component_kind="source" component_id=kubernetes_logs component_type=kubernetes_logs component_name=kubernetes_logs}:file_server: vector::internal_events::file::source: Stopped watching file. file=/var/log/pods/app_flog-0_976669c2-8cd7-41b5-9a13-bd481c386c7f/flog/0.log
2022-10-11T07:19:44.138910Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:45.139601Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:46.139902Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:47.138870Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:48.138703Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:49.139712Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
2022-10-11T07:19:50.139388Z  INFO vector::sinks::blackhole::sink: Total events collected events=167 raw_bytes_collected=628886
...

zhongzc avatar Oct 10 '22 10:10 zhongzc

Any news :)?

r0b3r4 avatar Oct 24 '22 05:10 r0b3r4

See my comment here, but in summary version 0.25.0 seems to contain a temporary or partial fix to this issue. I wasn't able to locate anything mentioning this in the release notes for this version, so some people may have missed it.

After upgrading to this version a couple of weeks ago, I haven't had any annotation issues.

CharlieC3 avatar Dec 01 '22 20:12 CharlieC3

The only thing that confuses me is that the fix is somehow connected to unrelated options. As I understand it, this might only work as a temporary kludge, right?

version >=0.25.x +

max_line_bytes: 16777216
max_read_bytes: 16777216
glob_minimum_cooldown_ms: 1000

r0b3r4 avatar Dec 05 '22 06:12 r0b3r4

@r0b3r4 The config options were only helpful in the linked issue for me, while upgrading to version >=0.25.x is specifically what helped me resolve this annotation issue.

CharlieC3 avatar Dec 05 '22 16:12 CharlieC3