I am going crazy with this issue and I am starting to think that fluent-bit agent logs are processed differently than other logs. For context, I've added the fluentbit.io/exclude: 'true' pod annotation (see the configuration below). Compare a fluent-bit agent log entry with a log entry from a random other pod.

Fluent-bit agent log:

{
"textPayload": "[2022/12/08 21:44:01] [ info] [filter:kubernetes:kubernetes.0] token updated",
"insertId": "6lszhugj9knbb63g",
"resource": {
"type": "k8s_container",
"labels": {
"cluster_name": "papaship",
"pod_name": "contra-fluent-bit-dk8mg",
"namespace_name": "contra",
"location": "us-central1-c",
"project_id": "contrawork",
"container_name": "fluent-bit"
}
},
"timestamp": "2022-12-08T21:44:01.200616171Z",
"severity": "ERROR",
"labels": {
"compute.googleapis.com/resource_name": "gke-papaship-primary-pool-497dd63c-mdt3",
"k8s-pod/controller-revision-hash": "5d95c84c8b",
"k8s-pod/app_kubernetes_io/name": "fluent-bit",
"k8s-pod/pod-template-generation": "22",
"k8s-pod/app_kubernetes_io/instance": "contra-fluent-bit"
},
"logName": "projects/contrawork/logs/stderr",
"receiveTimestamp": "2022-12-08T21:44:02.204954498Z"
}

Other random pod log:

{
"insertId": "j59mbtffrjxaj",
"jsonPayload": {
"sequence": "13000.20",
"context": {
"package": "slonik",
"reqId": "f547241c-669c-4a13-95dc-f6f148dcbfe9",
"stats": {
"idleConnectionCount": 25,
"waitingRequestCount": 0,
"totalConnectionCount": 28
},
"logLevel": 20,
"poolId": "WoKT8HVsR1ylBRsi2ewDRQ-0",
"playwright": {
"testCase": "Client can retry payment for failed payment for milestone",
"file": "/home/github/actions-runner/_work/contra-web-app/contra-web-app/tests/e2e/paid-projects/wallet-failed-payment.spec.ts",
"uid": "6368f763-3db4-4562-bdd4-7bd2f8038a6a",
"project": ""
}
},
"version": "2.0.0",
"_p": "F",
"time": 1670535546330,
"foo": "barquux",
"message": "client is checked out from the pool",
"kubernetes": {
"docker_id": "ee54113e83f047e062dc6038712d6b2bc54c9bdcdc21fb87878970c03c0682ea",
"annotations": {
"checksum/secret": "7835c4aa87b915426030e1be6f6f6bea630131f4505e99d7218ac36f6fe1e00f"
},
"host": "gke-papaship-primary-pool-497dd63c-zxhz",
"labels": {
"app.kubernetes.io/environment": "production",
"app.kubernetes.io/name": "contra-api",
"app.kubernetes.io/version": "fb1b6e14",
"app.kubernetes.io/managed-by": "helm",
"rollouts-pod-template-hash": "56df5bdbcb",
"app.kubernetes.io/instance": "contra-api"
},
"container_name": "contra-api",
"pod_id": "7e9d8647-3024-432e-b381-06d63f201e61",
"pod_name": "contra-api-56df5bdbcb-6cw5r",
"namespace_name": "contra",
"container_hash": "us-central1-docker.pkg.dev/contrawork/contra-api/production@sha256:d04c5740c017b4fb11b912f5ba8c5700be378d029c8c14178a3461b5f918a39a",
"container_image": "us-central1-docker.pkg.dev/contrawork/contra-api/production:fb1b6e14"
},
"log": "{\"context\":{\"reqId\":\"f547241c-669c-4a13-95dc-f6f148dcbfe9\",\"package\":\"slonik\",\"poolId\":\"WoKT8HVsR1ylBRsi2ewDRQ-0\",\"logLevel\":20,\"stats\":{\"idleConnectionCount\":25,\"totalConnectionCount\":28,\"waitingRequestCount\":0},\"playwright\":{\"file\":\"/home/github/actions-runner/_work/contra-web-app/contra-web-app/tests/e2e/paid-projects/wallet-failed-payment.spec.ts\",\"project\":\"\",\"testCase\":\"Client can retry payment for failed payment for milestone\",\"uid\":\"6368f763-3db4-4562-bdd4-7bd2f8038a6a\"}},\"message\":\"client is checked out from the pool\",\"sequence\":\"13000.20\",\"time\":1670535546330,\"version\":\"2.0.0\"}"
},
"resource": {
"type": "k8s_container",
"labels": {
"location": "us-central1-c",
"namespace_name": "contra",
"container_name": "contra-api",
"project_id": "contrawork",
"pod_name": "ontainers.contra-api-56df5bdbcb-6cw5r",
"cluster_name": "papaship"
}
},
"timestamp": "2022-12-08T21:39:06.330901878Z",
"logName": "projects/contrawork/logs/stdout",
"receiveTimestamp": "2022-12-08T21:39:06.575897243Z"
}

My configuration:

fluent-bit:
config:
customParsers: |
[PARSER]
Name fluent_bit
Format regex
Regex ^\[(?<time>[^\]]*)\] \[\s*(?<code>[^\]]*)\] (?<message>.*)$
filters: |
[FILTER]
Name kubernetes
Match kube.*
Merge_Log On
Keep_Log On
K8S-Logging.Parser On
K8S-Logging.Exclude On
Buffer_Size 64KB
Labels On
Annotations On
[FILTER]
name modify
match *
set foo barquux
inputs: |
[INPUT]
Name tail
Path /var/log/containers/*.log
multiline.parser docker, cri
Tag kube.*
Mem_Buf_Limit 5MB
Skip_Long_Lines On
Buffer_Chunk_Size 64KB
Buffer_Max_Size 128KB
[INPUT]
Name systemd
Tag host.*
Systemd_Filter _SYSTEMD_UNIT=kubelet.service
Read_From_Tail On
outputs: |
[OUTPUT]
Name stackdriver
Match kube.*
resource k8s_container
k8s_cluster_name papaship
k8s_cluster_location us-central1-c
podAnnotations:
# fluentbit.io/parser: fluent_bit
fluentbit.io/exclude: 'true'

Why am I not seeing any of the Kubernetes annotations on the fluent-bit logs?

When debugging this, I added stdout as an output:

[OUTPUT]
name stdout
match *

The logged output looks almost like a log entry wrapped inside another log entry. I am not sure what is happening here.
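As a sanity check that the custom fluent_bit parser itself is valid, here is a minimal Python translation of its regex applied to the sample agent log line from above. This is only an approximation: fluent-bit's Onigmo engine writes named groups as (?&lt;name&gt;...), while Python uses (?P&lt;name&gt;...); the pattern is otherwise copied verbatim from the customParsers section.

```python
import re

# Python translation of the custom fluent_bit parser regex
# (named-group syntax adapted from Onigmo to Python).
FLUENT_BIT_PARSER = re.compile(
    r"^\[(?P<time>[^\]]*)\] \[\s*(?P<code>[^\]]*)\] (?P<message>.*)$"
)

# The textPayload from the fluent-bit agent log entry above.
sample = "[2022/12/08 21:44:01] [ info] [filter:kubernetes:kubernetes.0] token updated"

match = FLUENT_BIT_PARSER.match(sample)
assert match is not None
print(match.group("time"))     # 2022/12/08 21:44:01
print(match.group("code"))     # info
print(match.group("message"))  # [filter:kubernetes:kubernetes.0] token updated
```

So the parser does match the agent's log format; the problem is not the regex itself.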
Replies: 1 comment
Turns out I forgot to disable native GKE log ingestion, resulting in logs being ingested twice.
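For anyone hitting the same thing: when you run your own fluent-bit DaemonSet on GKE, the managed logging agent will otherwise ship the same container logs a second time, which produces exactly this "log entry wrapped inside another log entry" effect. A sketch of how to keep GKE system logs while turning off managed workload log collection (cluster name and zone taken from the logs above; verify the --logging flag values against current gcloud documentation):

```shell
# Keep GKE system-component logs but stop the managed agent from
# collecting workload (container) logs, so only fluent-bit ships them.
gcloud container clusters update papaship \
  --zone us-central1-c \
  --logging=SYSTEM
```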