aws-otel-lambda
Not able to export metrics to awsprometheus
Hello, I am trying to export metrics from a Lambda function. I have followed the steps as documented (built the downstream layer and deployed the integration-tests folder using Terraform). I am using the Node.js sample app and layer. I am still not able to get metrics into Amazon Managed Service for Prometheus (AMP), nor through the otlphttp exporter. Any idea what might be wrong?
My config file is as follows:
receivers:
  otlp:
    protocols:
      grpc:
      http:
exporters:
  logging:
    loglevel: debug
  awsprometheusremotewrite:
    endpoint: https://aps-workspaces.us-east-2.amazonaws.com/workspaces/<ws>/api/v1/remote_write
    aws_auth:
      region: us-east-2
      service: 'aps'
  otlphttp:
    endpoint: <endpoint>
service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [awsprometheusremotewrite]
    metrics:
      receivers: [otlp]
      exporters: [awsprometheusremotewrite]
Logs of the Lambda function:
START RequestId: bcfbaa09-e578-46dc-a4f9-90da2c98f2ba Version: $LATEST
2021/10/15 12:20:11 [collector] Launching OpenTelemetry Lambda extension, version: v0.1.0
2021-10-15T12:20:11.362Z info service/collector.go:176 Applying configuration...
2021-10-15T12:20:11.362Z info builder/exporters_builder.go:227 Ignoring exporter as it is not used by any pipeline { "kind": "exporter", "name": "logging" }
2021-10-15T12:20:11.363Z info builder/exporters_builder.go:265 Exporter was built. { "kind": "exporter", "name": "awsprometheusremotewrite" }
2021-10-15T12:20:11.363Z info builder/exporters_builder.go:227 Ignoring exporter as it is not used by any pipeline { "kind": "exporter", "name": "otlphttp" }
2021-10-15T12:20:11.363Z info builder/pipelines_builder.go:214 Pipeline was built. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-10-15T12:20:11.363Z info builder/receivers_builder.go:228 Receiver was built. { "kind": "receiver", "name": "otlp", "datatype": "metrics" }
2021-10-15T12:20:11.363Z info service/service.go:101 Starting extensions...
2021-10-15T12:20:11.363Z info service/service.go:106 Starting exporters...
2021-10-15T12:20:11.363Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "logging" }
2021-10-15T12:20:11.363Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "logging" }
2021-10-15T12:20:11.363Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "awsprometheusremotewrite" }
2021-10-15T12:20:11.365Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "awsprometheusremotewrite" }
2021-10-15T12:20:11.365Z info builder/exporters_builder.go:92 Exporter is starting... { "kind": "exporter", "name": "otlphttp" }
2021-10-15T12:20:11.365Z info builder/exporters_builder.go:97 Exporter started. { "kind": "exporter", "name": "otlphttp" }
2021-10-15T12:20:11.365Z info service/service.go:111 Starting processors...
2021-10-15T12:20:11.365Z info builder/pipelines_builder.go:51 Pipeline is starting... { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-10-15T12:20:11.365Z info builder/pipelines_builder.go:62 Pipeline is started. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-10-15T12:20:11.365Z info service/service.go:116 Starting receivers...
2021-10-15T12:20:11.365Z info builder/receivers_builder.go:70 Receiver is starting... { "kind": "receiver", "name": "otlp" }
2021-10-15T12:20:11.365Z info otlpreceiver/otlp.go:74 Starting GRPC server on endpoint 0.0.0.0:4317 { "kind": "receiver", "name": "otlp" }
2021-10-15T12:20:11.365Z info otlpreceiver/otlp.go:92 Starting HTTP server on endpoint 0.0.0.0:4318 { "kind": "receiver", "name": "otlp" }
2021-10-15T12:20:11.365Z info otlpreceiver/otlp.go:147 Setting up a second HTTP listener on legacy endpoint 0.0.0.0:55681 { "kind": "receiver", "name": "otlp" }
2021-10-15T12:20:11.365Z info otlpreceiver/otlp.go:92 Starting HTTP server on endpoint 0.0.0.0:55681 { "kind": "receiver", "name": "otlp" }
2021-10-15T12:20:11.365Z info builder/receivers_builder.go:75 Receiver started. { "kind": "receiver", "name": "otlp" }
2021-10-15T12:20:11.365Z info service/telemetry.go:65 Setting up own telemetry...
2021-10-15T12:20:11.366Z info service/telemetry.go:113 Serving Prometheus metrics { "address": ":8888", "level": 0, "service.instance.id": "a800ee06-64f7-4085-8868-be2240376ff1" }
2021-10-15T12:20:11.366Z info service/collector.go:230 Starting otelcol... { "Version": "v0.1.0", "NumCPU": 2 }
2021-10-15T12:20:11.366Z info service/collector.go:134 Everything is ready. Begin running and processing data.
2021/10/15 12:20:11 Registered extension ID: "f58fa35e-d869-45bc-b661-e647430bcc00"
2021/10/15 12:20:11 [collector] Register response: {
"functionName": "hello-nodejs-awssdk",
"functionVersion": "$LATEST",
"handler": "index.handler"
}
2021/10/15 12:20:11 [collector] Waiting for event...
Registering OpenTelemetry
EXTENSION Name: collector State: Ready Events: [INVOKE,SHUTDOWN]
2021/10/15 12:20:12 [collector] Received event: {
"eventType": "INVOKE",
"deadlineMs": 1634300432688,
"requestId": "bcfbaa09-e578-46dc-a4f9-90da2c98f2ba",
"invokedFunctionArn": "arn:aws:lambda:us-east-2:739457818465:function:hello-nodejs-awssdk",
"tracing": {
"type": "X-Amzn-Trace-Id",
"value": "Root=1-616971fa-0298ca0d6bb48b175b16932f;Parent=2e6615f64704d786;Sampled=1"
}
}
2021/10/15 12:20:12 [collector] Waiting for event...
2021-10-15T12:20:13.091Z bcfbaa09-e578-46dc-a4f9-90da2c98f2ba INFO Serving lambda request.
END RequestId: bcfbaa09-e578-46dc-a4f9-90da2c98f2ba
REPORT RequestId: bcfbaa09-e578-46dc-a4f9-90da2c98f2ba Duration: 855.81 ms Billed Duration: 856 ms Memory Size: 384 MB Max Memory Used: 137 MB Init Duration: 1505.31 ms XRAY TraceId: 1-616971fa-0298ca0d6bb48b175b16932f SegmentId: 2e6615f64704d786 Sampled: true
START RequestId: ca6700ea-c227-40c6-a197-5624af1b8930 Version: $LATEST
2021/10/15 12:20:35 [collector] Received event: {
"eventType": "INVOKE",
"deadlineMs": 1634300455007,
"requestId": "ca6700ea-c227-40c6-a197-5624af1b8930",
"invokedFunctionArn": "arn:aws:lambda:us-east-2:739457818465:function:hello-nodejs-awssdk",
"tracing": {
"type": "X-Amzn-Trace-Id",
"value": "Root=1-61697212-2a26451a312587f4167bb880;Parent=0ca39ae26ef9faee;Sampled=1"
}
}
2021/10/15 12:20:35 [collector] Waiting for event...
2021-10-15T12:20:35.008Z ca6700ea-c227-40c6-a197-5624af1b8930 INFO Serving lambda request.
END RequestId: ca6700ea-c227-40c6-a197-5624af1b8930
REPORT RequestId: ca6700ea-c227-40c6-a197-5624af1b8930 Duration: 117.02 ms Billed Duration: 118 ms Memory Size: 384 MB Max Memory Used: 138 MB XRAY TraceId: 1-61697212-2a26451a312587f4167bb880 SegmentId: 0ca39ae26ef9faee Sampled: true
2021/10/15 12:26:10 [collector] Received event: {
"eventType": "SHUTDOWN",
"deadlineMs": 1634300772893,
"requestId": "",
"invokedFunctionArn": "",
"tracing": {
"type": "",
"value": ""
}
}
2021-10-15T12:26:10.907Z info service/collector.go:150 Received shutdown request
2021-10-15T12:26:10.907Z info service/collector.go:242 Starting shutdown...
2021-10-15T12:26:10.907Z info service/service.go:136 Stopping receivers...
2021-10-15T12:26:10.909Z info service/service.go:141 Stopping processors...
2021-10-15T12:26:10.909Z info builder/pipelines_builder.go:70 Pipeline is shutting down... { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-10-15T12:26:10.909Z info builder/pipelines_builder.go:76 Pipeline is shutdown. { "pipeline_name": "metrics", "pipeline_datatype": "metrics" }
2021-10-15T12:26:10.909Z info service/service.go:146 Stopping exporters...
2021-10-15T12:26:10.909Z info service/service.go:151 Stopping extensions...
2021-10-15T12:26:10.909Z info service/collector.go:258 Shutdown complete.
2021/10/15 12:26:10 [collector] Received SHUTDOWN event
2021/10/15 12:26:10 [collector] Exiting
@alolita @anuraaga Would appreciate your help on this. I am basically not able to export metrics from the provided Node.js sample app to AMP. I have tried deploying both the upstream and downstream versions. From the documentation, I gather that this works for the Java-based sample app. Is it a work in progress for Node.js?
Hi @vishalsaugat - what does your code look like for collecting metrics in the Node.js app? One important thing is to make sure to force flush the metrics - FWIU, Node.js doesn't currently support this, so I guess your app isn't doing it. Metrics is still in an alpha state, so support varies quite a bit across languages; my understanding is that Java and Go are the most mature. We currently only officially support exporting metrics to AMP with Lambda using Java, and I think you may need to wait a bit for upstream support in the other languages to mature.
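For illustration, here is a minimal sketch of what force-flushing metrics inside a Lambda handler could look like, assuming a recent @opentelemetry/sdk-metrics where MeterProvider exposes forceFlush() (the JS SDK discussed above did not support this at the time). The function name, meter name, and localhost collector endpoint are assumptions for the example, not part of this repo.

// Sketch only: record a metric per invocation and force-flush before the
// Lambda runtime freezes the environment, so the export actually happens.
import { MeterProvider, PeriodicExportingMetricReader } from '@opentelemetry/sdk-metrics';
import { OTLPMetricExporter } from '@opentelemetry/exporter-metrics-otlp-http';

const meterProvider = new MeterProvider();
meterProvider.addMetricReader(
  new PeriodicExportingMetricReader({
    // Send to the collector extension listening on localhost (assumed port/path).
    exporter: new OTLPMetricExporter({ url: 'http://localhost:4318/v1/metrics' }),
  }),
);

const meter = meterProvider.getMeter('sample-app');
const invocations = meter.createCounter('invocations');

export const handler = async () => {
  invocations.add(1);
  // Without this flush the periodic reader may never get to export,
  // and nothing reaches the awsprometheusremotewrite exporter / AMP.
  await meterProvider.forceFlush();
  return { statusCode: 200 };
};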
Echoing what @anuraaga said, it looks like the Prometheus Metric Exporter is still an experimental package in upstream OTel JS. That README.md gives some examples of how to run a JS app using the exporter, so maybe, if possible, it would help to first verify that you can get metrics from a JS app to the Prometheus backend using OTel JS without Lambda?
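As a rough sketch of that kind of standalone check, assuming a recent @opentelemetry/exporter-prometheus and @opentelemetry/sdk-metrics (meter and counter names below are made up for the example; note this exposes a pull-based scrape endpoint rather than pushing via remote write):

// Sketch only: run locally, then curl http://localhost:9464/metrics to
// confirm the JS metrics pipeline works before involving Lambda.
import { PrometheusExporter } from '@opentelemetry/exporter-prometheus';
import { MeterProvider } from '@opentelemetry/sdk-metrics';

const exporter = new PrometheusExporter({ port: 9464 }, () => {
  console.log('Prometheus scrape endpoint: http://localhost:9464/metrics');
});

const meterProvider = new MeterProvider();
meterProvider.addMetricReader(exporter);

const meter = meterProvider.getMeter('prometheus-check');
const counter = meter.createCounter('requests_total');
setInterval(() => counter.add(1), 1000);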
@vishalsaugat were you able to solve your issue?
@NathanielRN we are seeing the same issue with the ADOT layer for Python with auto-instrumentation. Is this the case with Python as well?
This issue is stale because it has been open 90 days with no activity. If you want to keep this issue open, please just leave a comment below and auto-close will be canceled
Is there more information/development on this issue?
Can you please verify whether this issue still exists with the new Lambda layer release? Force flush of the meter provider was introduced in opentelemetry-js-contrib by this PR, and it has been picked up by the latest ADOT Lambda layer for metrics.
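When re-testing with the newer layer, one hypothetical way to check whether a flushable provider is actually registered is to probe the global meter provider from the handler. This is only a diagnostic sketch: the API-level MeterProvider interface does not declare forceFlush(), so the cast and defensive check below are assumptions, not part of the layer's documented behavior.

// Sketch only: record a metric, then force-flush the globally registered
// provider if (and only if) the concrete SDK provider exposes forceFlush().
import { metrics } from '@opentelemetry/api';

export const handler = async () => {
  const meter = metrics.getMeter('flush-check');
  meter.createCounter('handler_invocations').add(1);

  const provider = metrics.getMeterProvider() as { forceFlush?: () => Promise<void> };
  if (typeof provider.forceFlush === 'function') {
    await provider.forceFlush(); // newer SDKs / layers expose this
  } else {
    console.log('MeterProvider has no forceFlush(); relying on the layer to flush');
  }
  return { statusCode: 200 };
};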
This issue is stale because it has been open 90 days with no activity. If you want to keep this issue open, please just leave a comment below and auto-close will be canceled
This issue was closed because it has been marked as stale for 30 days with no activity.