aws-otel-lambda
[Nodejs] AWS auto-instrumentation doesn't propagate parent trace.id and span.id properly
Using Node.js lambdas, I am trying to get my consumer lambda to call the producer lambda and propagate the trace context between them using the traceparent HTTP header. So, I have the following environment variable set:
OTEL_PROPAGATORS: tracecontext
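For reference, the relevant wiring looks roughly like this (a sketch assuming the Serverless Framework; the layer ARN is a placeholder for the aws-otel-nodejs layer published in your region):

functions:
  consumer:
    handler: handler.consumer
    layers:
      - <aws-otel-nodejs layer ARN for your region>   # placeholder
    environment:
      AWS_LAMBDA_EXEC_WRAPPER: /opt/otel-handler      # wrapper script shipped in the layer
      OTEL_PROPAGATORS: tracecontext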
I can also see the code performing the header injection properly and passing the header on from the consumer lambda, and it is received by the producer lambda, as per the logs.
The consumer logs show the span with the correct span.id to be propagated:
InstrumentationLibrary @opentelemetry/instrumentation-http 0.19.0
--
Span #0
Trace ID : 60e6a7a924cc736a604af2a602d84600 <==== TRACE ID
Parent ID : 2430b46dd43fac16
ID : 81eb76e307cb48a4 <==== SPAN ID
Name : HTTPS GET
Kind : SPAN_KIND_CLIENT
Start time : 2021-07-08 07:22:19.353159168 +0000 UTC
End time : 2021-07-08 07:22:20.792924928 +0000 UTC
Status code : STATUS_CODE_OK
Status message :
Attributes:
-> http.url: STRING(https://pgzr057had.execute-api.ap-southeast-2.amazonaws.com/dev/ping)
-> http.method: STRING(GET)
-> http.target: STRING(/dev/ping)
-> net.peer.name: STRING(pgzr057had.execute-api.ap-southeast-2.amazonaws.com)
-> net.peer.ip: STRING(13.226.106.129)
-> net.peer.port: INT(443)
-> http.host: STRING(pgzr057had.execute-api.ap-southeast-2.amazonaws.com:443)
-> http.response_content_length_uncompressed: INT(88)
-> http.status_code: INT(200)
-> http.status_text: STRING(OK)
-> http.flavor: STRING(1.1)
-> net.transport: STRING(IP.TCP)
Producer logs receiving the traceparent header with correct propagation data:
"traceparent": "00-60e6a7a924cc736a604af2a602d84600-81eb76e307cb48a4-01",
However, the span that the producer lambda generates has a totally different parent from the one that is sent in the traceparent header:
2021-07-08T07:22:20.781Z DEBUG loggingexporter/logging_exporter.go:48 ResourceSpans #0
--
Resource labels:
-> service.name: STRING(aws-node-simple-http-endpoint-dev-consumer)
-> telemetry.sdk.language: STRING(nodejs)
-> telemetry.sdk.name: STRING(opentelemetry)
-> telemetry.sdk.version: STRING(0.19.0)
-> cloud.provider: STRING(aws)
-> cloud.region: STRING(ap-southeast-2)
-> faas.name: STRING(aws-node-simple-http-endpoint-dev-consumer)
-> faas.version: STRING($LATEST)
-> process.id: INT(18)
-> process.executable.name: STRING(/var/lang/bin/node)
-> process.command: STRING(/var/runtime/index.js)
-> process.command_line: STRING(/var/lang/bin/node /var/runtime/index.js)
InstrumentationLibrarySpans #0
InstrumentationLibrary @opentelemetry/instrumentation-aws-lambda 0.16.0
Span #0
Trace ID : 60e6a7ab0e1a895f1930b0f93004a373 <=== Different TRACE ID
Parent ID : 05eb499bf8d606d3 <=== Different Parent ID
ID : 906c6487661e9ec0
Name : aws-node-simple-http-endpoint-dev-consumer
Kind : SPAN_KIND_SERVER
Start time : 2021-07-08 07:22:20.718549504 +0000 UTC
End time : 2021-07-08 07:22:20.722462208 +0000 UTC
Status code : STATUS_CODE_UNSET
Status message :
Attributes:
-> faas.execution: STRING(85b62062-31fb-49ab-a5b0-8cba44a1315c)
-> faas.id: STRING(arn:aws:lambda:ap-southeast-2:401722391821:function:aws-node-simple-http-endpoint-dev-consumer)
-> cloud.account.id: STRING(401722391821)
Consumer lambda logs
START RequestId: 327318f1-18d3-4288-8caa-bb3dcbce0f69 Version: $LATEST
--
2021/07/08 07:22:18 [collector] Launching OpenTelemetry Lambda extension, version: v0.1.0
2021-07-08T07:22:18.343Z info service/application.go:277 Starting otelcol... { "Version": "v0.1.0", "NumCPU": 2 }
2021-07-08T07:22:18.343Z info service/application.go:185 Setting up own telemetry...
2021-07-08T07:22:18.343Z info service/application.go:220 Loading configuration...
2021-07-08T07:22:18.345Z info service/application.go:236 Applying configuration...
2021-07-08T07:22:18.345Z info builder/exporters_builder.go:274 Exporter was built. { "kind": "exporter", "exporter": "logging" }
2021-07-08T07:22:18.346Z info builder/exporters_builder.go:274 Exporter was built. { "kind": "exporter", "exporter": "otlp" }
2021-07-08T07:22:18.346Z info builder/pipelines_builder.go:204 Pipeline was built. { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:22:18.346Z info builder/pipelines_builder.go:204 Pipeline was built. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:22:18.346Z info builder/receivers_builder.go:230 Receiver was built. { "kind": "receiver", "name": "otlp", "datatype": "traces" }
2021-07-08T07:22:18.347Z info builder/receivers_builder.go:230 Receiver was built. { "kind": "receiver", "name": "otlp", "datatype": "metrics" }
2021-07-08T07:22:18.347Z info service/service.go:155 Starting extensions...
2021-07-08T07:22:18.347Z info service/service.go:200 Starting exporters...
2021-07-08T07:22:18.347Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "otlp" }
2021-07-08T07:22:18.347Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "otlp" }
2021-07-08T07:22:18.347Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "logging" }
2021-07-08T07:22:18.347Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "logging" }
2021-07-08T07:22:18.347Z info service/service.go:205 Starting processors...
2021-07-08T07:22:18.347Z info builder/pipelines_builder.go:51 Pipeline is starting... { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:22:18.347Z info builder/pipelines_builder.go:62 Pipeline is started. { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:22:18.347Z info builder/pipelines_builder.go:51 Pipeline is starting... { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:22:18.347Z info builder/pipelines_builder.go:62 Pipeline is started. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:22:18.347Z info service/service.go:210 Starting receivers...
2021-07-08T07:22:18.347Z info builder/receivers_builder.go:70 Receiver is starting... { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:18.347Z info otlpreceiver/otlp.go:83 Starting GRPC server on endpoint 0.0.0.0:4317 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:18.347Z info otlpreceiver/otlp.go:145 Setting up a second GRPC listener on legacy endpoint 0.0.0.0:55680 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:18.347Z info otlpreceiver/otlp.go:83 Starting GRPC server on endpoint 0.0.0.0:55680 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:18.347Z info otlpreceiver/otlp.go:101 Starting HTTP server on endpoint 0.0.0.0:55681 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:18.347Z info builder/receivers_builder.go:75 Receiver started. { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:18.347Z info service/application.go:197 Everything is ready. Begin running and processing data.
2021/07/08 07:22:18 Registered extension ID: "f13fddea-3af0-454b-8d3c-739edf925beb"
2021/07/08 07:22:18 [collector] Register response: {
"functionName": "aws-node-simple-http-endpoint-dev-producer",
"functionVersion": "$LATEST",
"handler": "handler.consumer"
}
2021/07/08 07:22:18 [collector] Waiting for event...
Registering OpenTelemetry
EXTENSION Name: collector State: Ready Events: [INVOKE,SHUTDOWN]
2021/07/08 07:22:19 [collector] Received event: {
"eventType": "INVOKE",
"deadlineMs": 1625728945219,
"requestId": "327318f1-18d3-4288-8caa-bb3dcbce0f69",
"invokedFunctionArn": "arn:aws:lambda:ap-southeast-2:401722391821:function:aws-node-simple-http-endpoint-dev-producer",
"tracing": {
"type": "X-Amzn-Trace-Id",
"value": "Root=1-60e6a7a9-24cc736a604af2a602d84600;Parent=3d3d3a7622e60477;Sampled=1"
}
}
2021/07/08 07:22:19 [collector] Waiting for event...
2021-07-08T07:22:20.793Z 327318f1-18d3-4288-8caa-bb3dcbce0f69 INFO Status: 200
2021-07-08T07:22:20.793Z 327318f1-18d3-4288-8caa-bb3dcbce0f69 INFO { "message": "Hello, the current time is 07:22:20 GMT+0000 (Coordinated Universal Time)." }
2021-07-08T07:22:20.831Z INFO loggingexporter/logging_exporter.go:42 TracesExporter { "#spans": 3 }
2021-07-08T07:22:20.831Z DEBUG loggingexporter/logging_exporter.go:48 ResourceSpans #0
Resource labels:
-> service.name: STRING(aws-node-simple-http-endpoint-dev-producer)
-> telemetry.sdk.language: STRING(nodejs)
-> telemetry.sdk.name: STRING(opentelemetry)
-> telemetry.sdk.version: STRING(0.19.0)
-> cloud.provider: STRING(aws)
-> cloud.region: STRING(ap-southeast-2)
-> faas.name: STRING(aws-node-simple-http-endpoint-dev-producer)
-> faas.version: STRING($LATEST)
-> process.id: INT(17)
-> process.executable.name: STRING(/var/lang/bin/node)
-> process.command: STRING(/var/runtime/index.js)
-> process.command_line: STRING(/var/lang/bin/node /var/runtime/index.js)
InstrumentationLibrarySpans #0
InstrumentationLibrary @opentelemetry/instrumentation-net 0.16.0
Span #0
Trace ID : 60e6a7a924cc736a604af2a602d84600
Parent ID : 81eb76e307cb48a4
ID : 78b32ffd863269f2
Name : tcp.connect
Kind : SPAN_KIND_INTERNAL
Start time : 2021-07-08 07:22:19.377481728 +0000 UTC
End time : 2021-07-08 07:22:19.39574656 +0000 UTC
Status code : STATUS_CODE_UNSET
Status message :
Attributes:
-> net.transport: STRING(IP.TCP)
-> net.peer.name: STRING(pgzr057had.execute-api.ap-southeast-2.amazonaws.com)
-> net.peer.port: INT(443)
-> net.peer.ip: STRING(13.226.106.129)
-> net.host.ip: STRING(169.254.76.1)
-> net.host.port: INT(49788)
InstrumentationLibrarySpans #1
InstrumentationLibrary @opentelemetry/instrumentation-http 0.19.0
Span #0
Trace ID : 60e6a7a924cc736a604af2a602d84600
Parent ID : 2430b46dd43fac16
ID : 81eb76e307cb48a4
Name : HTTPS GET
Kind : SPAN_KIND_CLIENT
Start time : 2021-07-08 07:22:19.353159168 +0000 UTC
End time : 2021-07-08 07:22:20.792924928 +0000 UTC
Status code : STATUS_CODE_OK
Status message :
Attributes:
-> http.url: STRING(https://pgzr057had.execute-api.ap-southeast-2.amazonaws.com/dev/ping)
-> http.method: STRING(GET)
-> http.target: STRING(/dev/ping)
-> net.peer.name: STRING(pgzr057had.execute-api.ap-southeast-2.amazonaws.com)
-> net.peer.ip: STRING(13.226.106.129)
-> net.peer.port: INT(443)
-> http.host: STRING(pgzr057had.execute-api.ap-southeast-2.amazonaws.com:443)
-> http.response_content_length_uncompressed: INT(88)
-> http.status_code: INT(200)
-> http.status_text: STRING(OK)
-> http.flavor: STRING(1.1)
-> net.transport: STRING(IP.TCP)
InstrumentationLibrarySpans #2
InstrumentationLibrary @opentelemetry/instrumentation-aws-lambda 0.16.0
Span #0
Trace ID : 60e6a7a924cc736a604af2a602d84600
Parent ID : 7818715d2bd817ba
ID : 2430b46dd43fac16
Name : aws-node-simple-http-endpoint-dev-producer
Kind : SPAN_KIND_SERVER
Start time : 2021-07-08 07:22:19.318216704 +0000 UTC
End time : 2021-07-08 07:22:20.794181376 +0000 UTC
Status code : STATUS_CODE_UNSET
Status message :
Attributes:
-> faas.execution: STRING(327318f1-18d3-4288-8caa-bb3dcbce0f69)
-> faas.id: STRING(arn:aws:lambda:ap-southeast-2:401722391821:function:aws-node-simple-http-endpoint-dev-producer)
-> cloud.account.id: STRING(401722391821)
END RequestId: 327318f1-18d3-4288-8caa-bb3dcbce0f69
REPORT RequestId: 327318f1-18d3-4288-8caa-bb3dcbce0f69 Duration: 1615.37 ms Billed Duration: 1616 ms Memory Size: 1024 MB Max Memory Used: 160 MB Init Duration: 1104.55 ms XRAY TraceId: 1-60e6a7a9-24cc736a604af2a602d84600 SegmentId: 3d3d3a7622e60477 Sampled: true
2021-07-08T07:28:18.094Z error exporterhelper/queued_retry.go:295 Exporting failed. No more retries left. Dropping data. { "kind": "exporter", "name": "otlp", "error": "max elapsed time expired failed to push trace data via OTLP exporter: rpc error: code = DeadlineExceeded desc = context deadline exceeded", "dropped_items": 3 }
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
go.opentelemetry.io/[email protected]/exporter/exporterhelper/queued_retry.go:295
go.opentelemetry.io/collector/exporter/exporterhelper.(*tracesExporterWithObservability).send
go.opentelemetry.io/[email protected]/exporter/exporterhelper/traces.go:118
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
go.opentelemetry.io/[email protected]/exporter/exporterhelper/queued_retry.go:152
github.com/jaegertracing/jaeger/pkg/queue.ConsumerFunc.Consume
github.com/jaegertracing/[email protected]/pkg/queue/bounded_queue.go:104
github.com/jaegertracing/jaeger/pkg/queue.(*BoundedQueue).StartConsumersWithFactory.func1
github.com/jaegertracing/[email protected]/pkg/queue/bounded_queue.go:83
2021/07/08 07:28:18 [collector] Received event: {
"eventType": "SHUTDOWN",
"deadlineMs": 1625729300093,
"requestId": "",
"invokedFunctionArn": "",
"tracing": {
"type": "",
"value": ""
}
}
2021-07-08T07:28:18.099Z info service/application.go:212 Received stop test request
2021-07-08T07:28:18.099Z info service/application.go:307 Starting shutdown...
2021-07-08T07:28:18.099Z info service/service.go:225 Stopping receivers...
2021-07-08T07:28:18.101Z info service/service.go:231 Stopping processors...
2021-07-08T07:28:18.101Z info builder/pipelines_builder.go:70 Pipeline is shutting down... { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:28:18.101Z info builder/pipelines_builder.go:76 Pipeline is shutdown. { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:28:18.101Z info builder/pipelines_builder.go:70 Pipeline is shutting down... { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:28:18.101Z info builder/pipelines_builder.go:76 Pipeline is shutdown. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:28:18.101Z info service/service.go:237 Stopping exporters...
2021-07-08T07:28:18.101Z info service/service.go:164 Stopping extensions...
2021-07-08T07:28:18.101Z info service/application.go:325 Shutdown complete.
2021/07/08 07:28:18 [collector] Received SHUTDOWN event
2021/07/08 07:28:18 [collector] Exiting
Producer lambda logs
START RequestId: 85b62062-31fb-49ab-a5b0-8cba44a1315c Version: $LATEST
--
2021/07/08 07:22:19 [collector] Launching OpenTelemetry Lambda extension, version: v0.1.0
2021-07-08T07:22:19.771Z info service/application.go:277 Starting otelcol... { "Version": "v0.1.0", "NumCPU": 2 }
2021-07-08T07:22:19.771Z info service/application.go:185 Setting up own telemetry...
2021-07-08T07:22:19.771Z info service/application.go:220 Loading configuration...
2021-07-08T07:22:19.773Z info service/application.go:236 Applying configuration...
2021-07-08T07:22:19.774Z info builder/exporters_builder.go:274 Exporter was built. { "kind": "exporter", "exporter": "logging" }
2021-07-08T07:22:19.775Z info builder/exporters_builder.go:274 Exporter was built. { "kind": "exporter", "exporter": "otlp" }
2021-07-08T07:22:19.775Z info builder/pipelines_builder.go:204 Pipeline was built. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:22:19.776Z info builder/pipelines_builder.go:204 Pipeline was built. { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:22:19.776Z info builder/receivers_builder.go:230 Receiver was built. { "kind": "receiver", "name": "otlp", "datatype": "traces" }
2021-07-08T07:22:19.776Z info builder/receivers_builder.go:230 Receiver was built. { "kind": "receiver", "name": "otlp", "datatype": "metrics" }
2021-07-08T07:22:19.776Z info service/service.go:155 Starting extensions...
2021-07-08T07:22:19.776Z info service/service.go:200 Starting exporters...
2021-07-08T07:22:19.776Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "logging" }
2021-07-08T07:22:19.776Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "logging" }
2021-07-08T07:22:19.776Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "otlp" }
2021-07-08T07:22:19.776Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "otlp" }
2021-07-08T07:22:19.776Z info service/service.go:205 Starting processors...
2021-07-08T07:22:19.776Z info builder/pipelines_builder.go:51 Pipeline is starting... { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:22:19.776Z info builder/pipelines_builder.go:62 Pipeline is started. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-07-08T07:22:19.776Z info builder/pipelines_builder.go:51 Pipeline is starting... { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:22:19.776Z info builder/pipelines_builder.go:62 Pipeline is started. { "pipeline_name": "traces", "pipeline_datatype": "traces" }
2021-07-08T07:22:19.776Z info service/service.go:210 Starting receivers...
2021-07-08T07:22:19.776Z info builder/receivers_builder.go:70 Receiver is starting... { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:19.776Z info otlpreceiver/otlp.go:83 Starting GRPC server on endpoint 0.0.0.0:4317 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:19.776Z info otlpreceiver/otlp.go:145 Setting up a second GRPC listener on legacy endpoint 0.0.0.0:55680 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:19.776Z info otlpreceiver/otlp.go:83 Starting GRPC server on endpoint 0.0.0.0:55680 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:19.776Z info otlpreceiver/otlp.go:101 Starting HTTP server on endpoint 0.0.0.0:55681 { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:19.778Z info builder/receivers_builder.go:75 Receiver started. { "kind": "receiver", "name": "otlp" }
2021-07-08T07:22:19.778Z info service/application.go:197 Everything is ready. Begin running and processing data.
2021/07/08 07:22:19 Registered extension ID: "5e1aec70-1611-4aef-9369-7b18e2dd725f"
2021/07/08 07:22:19 [collector] Register response: {
"functionName": "aws-node-simple-http-endpoint-dev-consumer",
"functionVersion": "$LATEST",
"handler": "handler.producer"
}
2021/07/08 07:22:19 [collector] Waiting for event...
Registering OpenTelemetry
EXTENSION Name: collector State: Ready Events: [INVOKE,SHUTDOWN]
2021/07/08 07:22:20 [collector] Received event: {
"eventType": "INVOKE",
"deadlineMs": 1625728946604,
"requestId": "85b62062-31fb-49ab-a5b0-8cba44a1315c",
"invokedFunctionArn": "arn:aws:lambda:ap-southeast-2:401722391821:function:aws-node-simple-http-endpoint-dev-consumer",
"tracing": {
"type": "X-Amzn-Trace-Id",
"value": "Root=1-60e6a7ab-0e1a895f1930b0f93004a373;Parent=6d5d52cd6c5c8110;Sampled=1"
}
}
2021/07/08 07:22:20 [collector] Waiting for event...
2021-07-08T07:22:20.719Z 85b62062-31fb-49ab-a5b0-8cba44a1315c INFO Event: { "resource": "/ping", "path": "/ping", "httpMethod": "GET", "headers": { "CloudFront-Forwarded-Proto": "https", "CloudFront-Is-Desktop-Viewer": "true", "CloudFront-Is-Mobile-Viewer": "false", "CloudFront-Is-SmartTV-Viewer": "false", "CloudFront-Is-Tablet-Viewer": "false", "CloudFront-Viewer-Country": "AU", "Host": "pgzr057had.execute-api.ap-southeast-2.amazonaws.com", "traceparent": "00-60e6a7a924cc736a604af2a602d84600-81eb76e307cb48a4-01", "User-Agent": "Amazon CloudFront", "Via": "1.1 a6767fff426820a624528e674b714fb3.cloudfront.net (CloudFront)", "X-Amz-Cf-Id": "_kKJ5a_w11lV-6dzkHkD4h_tRWvkMMkyz2xLC1euTguJj4L0AmmNgQ==", "X-Amzn-Trace-Id": "Root=1-60e6a7ab-0e1a895f1930b0f93004a373", "X-Forwarded-For": "3.25.229.29, 64.252.184.78", "X-Forwarded-Port": "443", "X-Forwarded-Proto": "https" }, "multiValueHeaders": { "CloudFront-Forwarded-Proto": [ "https" ], "CloudFront-Is-Desktop-Viewer": [ "true" ], "CloudFront-Is-Mobile-Viewer": [ "false" ], "CloudFront-Is-SmartTV-Viewer": [ "false" ], "CloudFront-Is-Tablet-Viewer": [ "false" ], "CloudFront-Viewer-Country": [ "AU" ], "Host": [ "pgzr057had.execute-api.ap-southeast-2.amazonaws.com" ], "traceparent": [ "00-60e6a7a924cc736a604af2a602d84600-81eb76e307cb48a4-01" ], "User-Agent": [ "Amazon CloudFront" ], "Via": [ "1.1 a6767fff426820a624528e674b714fb3.cloudfront.net (CloudFront)" ], "X-Amz-Cf-Id": [ "_kKJ5a_w11lV-6dzkHkD4h_tRWvkMMkyz2xLC1euTguJj4L0AmmNgQ==" ], "X-Amzn-Trace-Id": [ "Root=1-60e6a7ab-0e1a895f1930b0f93004a373" ], "X-Forwarded-For": [ "3.25.229.29, 64.252.184.78" ], "X-Forwarded-Port": [ "443" ], "X-Forwarded-Proto": [ "https" ] }, "queryStringParameters": null, "multiValueQueryStringParameters": null, "pathParameters": null, "stageVariables": null, "requestContext": { "resourceId": "6t8m7d", "resourcePath": "/ping", "httpMethod": "GET", "extendedRequestId": "CI8i0FLnSwMF9TA=", "requestTime": "08/Jul/2021:07:22:19 +0000", "path": "/dev/ping", "accountId": "401722391821", "protocol": "HTTP/1.1", "stage": "dev", "domainPrefix": "pgzr057had", "requestTimeEpoch": 1625728939406, "requestId": "8d9cf442-ffd7-4bb4-8a4e-696097961e8f", "identity": { "cognitoIdentityPoolId": null, "accountId": null, "cognitoIdentityId": null, "caller": null, "sourceIp": "3.25.229.29", "principalOrgId": null, "accessKey": null, "cognitoAuthenticationType": null, "cognitoAuthenticationProvider": null, "userArn": null, "userAgent": "Amazon CloudFront", "user": null }, "domainName": "pgzr057had.execute-api.ap-southeast-2.amazonaws.com", "apiId": "pgzr057had" }, "body": null, "isBase64Encoded": false }
2021-07-08T07:22:20.781Z INFO loggingexporter/logging_exporter.go:42 TracesExporter { "#spans": 1 }
2021-07-08T07:22:20.781Z DEBUG loggingexporter/logging_exporter.go:48 ResourceSpans #0
Resource labels:
-> service.name: STRING(aws-node-simple-http-endpoint-dev-consumer)
-> telemetry.sdk.language: STRING(nodejs)
-> telemetry.sdk.name: STRING(opentelemetry)
-> telemetry.sdk.version: STRING(0.19.0)
-> cloud.provider: STRING(aws)
-> cloud.region: STRING(ap-southeast-2)
-> faas.name: STRING(aws-node-simple-http-endpoint-dev-consumer)
-> faas.version: STRING($LATEST)
-> process.id: INT(18)
-> process.executable.name: STRING(/var/lang/bin/node)
-> process.command: STRING(/var/runtime/index.js)
-> process.command_line: STRING(/var/lang/bin/node /var/runtime/index.js)
InstrumentationLibrarySpans #0
InstrumentationLibrary @opentelemetry/instrumentation-aws-lambda 0.16.0
Span #0
Trace ID : 60e6a7ab0e1a895f1930b0f93004a373
Parent ID : 05eb499bf8d606d3
ID : 906c6487661e9ec0
Name : aws-node-simple-http-endpoint-dev-consumer
Kind : SPAN_KIND_SERVER
Start time : 2021-07-08 07:22:20.718549504 +0000 UTC
End time : 2021-07-08 07:22:20.722462208 +0000 UTC
Status code : STATUS_CODE_UNSET
Status message :
Attributes:
-> faas.execution: STRING(85b62062-31fb-49ab-a5b0-8cba44a1315c)
-> faas.id: STRING(arn:aws:lambda:ap-southeast-2:401722391821:function:aws-node-simple-http-endpoint-dev-consumer)
-> cloud.account.id: STRING(401722391821)
END RequestId: 85b62062-31fb-49ab-a5b0-8cba44a1315c
REPORT RequestId: 85b62062-31fb-49ab-a5b0-8cba44a1315c Duration: 181.99 ms Billed Duration: 182 ms Memory Size: 1024 MB Max Memory Used: 155 MB Init Duration: 1035.04 ms XRAY TraceId: 1-60e6a7ab-0e1a895f1930b0f93004a373 SegmentId: 6d5d52cd6c5c8110 Sampled: true
Source code
'use strict';

const request = require('request');

// Producer: returns a simple timestamped message.
module.exports.producer = (event, context, callback) => {
  console.log(`Event: ${JSON.stringify(event)}`);
  const response = {
    statusCode: 200,
    body: JSON.stringify({
      message: `Hello, the current time is ${new Date().toTimeString()}.`,
    }),
  };
  callback(null, response);
};

// Consumer: calls the producer endpoint over HTTPS. Header injection is left
// to the auto-instrumentation, so no traceparent is set here explicitly.
module.exports.consumer = (event, context, callback) => {
  let options = {
    url: process.env.CONSUMER_API,
    headers: {},
  };
  request.get(options, (err, res, body) => {
    if (err) {
      return console.log(err);
    }
    console.log(`Status: ${res.statusCode}`);
    console.log(body);
    const response_final = {
      statusCode: 200,
      body: JSON.stringify({
        message: `The response from the producer is ${body}.`,
      }),
    };
    callback(null, response_final);
  });
};
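The header injection mentioned above is done automatically by the HTTP instrumentation. Done by hand with the OpenTelemetry API, it would look roughly like this (a sketch, assuming @opentelemetry/api is bundled with the function; the API surface shown is the stable one and may differ slightly from the 0.x versions in these logs):

const api = require('@opentelemetry/api');
const request = require('request');

module.exports.consumer = (event, lambdaContext, callback) => {
  const headers = {};
  // Write a traceparent header for the currently active span context,
  // using whichever propagator is registered (tracecontext here).
  api.propagation.inject(api.context.active(), headers);
  request.get({ url: process.env.CONSUMER_API, headers }, (err, res, body) => {
    if (err) return callback(err);
    callback(null, { statusCode: 200, body });
  });
};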
@michaelhyatt Are you using AWS X-Ray as the backend for tracing? If not, then can you make sure active tracing is disabled for your Lambdas? The lambda instrumentation will extract the parent from the traceparent header only if active tracing is disabled. If you're not using X-Ray, then you should have no need to enable active tracing on Lambda functions.
If you are using AWS X-Ray, then because AWS currently does not support the traceparent header for propagation (we are working on support for this now), you cannot use it and should use the default xray propagator set up by the aws-otel-lambda layer.
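For reference, with the Serverless Framework (an assumption about this deployment), active tracing is controlled by the provider-level tracing setting:

provider:
  tracing:
    lambda: PassThrough   # leave Active tracing off when X-Ray is not the backend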
@anuraaga I am not using X-Ray, but when I switch it off, all the traces appear as non-sampled and I am not seeing any traces generated. This is the traceparent header received by the producer lambda, with trace flags of 00 indicating the request was not sampled:
"traceparent": "00-60e6ac5a4a2a3d036d789ca833fc13aa-54a6f39fe0a24cbc-00",
What am I missing to get the sampling turned on?
Thanks @michaelhyatt - it seems to be a bug in the instrumentation that prevents a lambda from starting a trace. I have filed https://github.com/aws-observability/aws-otel-lambda/issues/118 to fix it.
However, I noticed that in the logs you showed, the span in the consumer seems to have a Parent ID - does that mean requests to the consumer include a traceparent? If so, then even now, as long as X-Ray is disabled, the sampling decision in that header should be respected.
@anuraaga I am not sending a traceparent header myself; it is most likely inserted by X-Ray, since my first example had Active tracing turned on.
@anuraaga Forgot to answer your original question: no, I am not injecting any external traceparent headers. I was asking about forcing sampling on in the code itself. I will have to dig around for examples, but if you have one, please post it here.
@michaelhyatt Ah yeah that would explain it. So once the bug is fixed it should be ok without any special configuration. Unfortunately I'm not aware of a good workaround in the meantime unless you create your own span without a parent.
@anuraaga I can confirm that injecting my own traceparent with the sampled flag set, and with Active tracing switched off, produces an end-to-end trace with correctly populated parents.
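For anyone hitting the same issue before the fix lands, the workaround described above amounts to sending the initial request with a hand-crafted, sampled traceparent so the lambda instrumentation extracts a sampled parent. A minimal sketch (CONSUMER_ENDPOINT is a placeholder for the consumer lambda's API Gateway URL):

'use strict';
const crypto = require('crypto');
const request = require('request');

// Build a W3C traceparent with the sampled flag (trailing 01) set.
const traceId = crypto.randomBytes(16).toString('hex'); // 32 hex chars
const spanId = crypto.randomBytes(8).toString('hex');   // 16 hex chars
const traceparent = `00-${traceId}-${spanId}-01`;

request.get(
  { url: process.env.CONSUMER_ENDPOINT, headers: { traceparent } },
  (err, res) => {
    if (err) return console.error(err);
    console.log(`Status: ${res.statusCode}`);
  }
);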
This issue is stale because it has been open 90 days with no activity. If you want to keep this issue open, please just leave a comment below and auto-close will be canceled
This issue was closed because it has been marked as stale for 30 days with no activity.