opentelemetry-lambda

New Python lambda layers do not capture any data

pranavmarla opened this issue 1 year ago • 1 comment

Describe the bug
I have instrumented a simple Python lambda function with the new OTEL collector lambda layer and the new Python instrumentation layer. The collector is configured to export metrics and traces both to the lambda logs (via the logging exporter) and to an external backend (via the otlphttp exporter).

Unfortunately, I do not see any metrics or trace data in either the lambda logs or the external backend. If I replace the new OTEL lambda layers with the old Amazon ADOT lambda layer, I am able to see traces in the lambda logs and both metrics and traces in the external backend. This suggests that the issue is with the new lambda layers.

Steps to reproduce
Create a (Python 3.9, x86_64) lambda with the following files:

  • lambda_function.py: Contains the lambda code
from urllib.request import urlopen
import json

def lambda_handler(event, context):

    with urlopen('http://postman-echo.com/get') as response:
        response_bytes = response.read()
        response_dict = json.loads(response_bytes)
    
    return response_dict
  • collector.yaml: Contains the OTEL collector config
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 'localhost:4317'
      http:
        endpoint: 'localhost:4318'

exporters:
  logging:
    verbosity: detailed
  otlphttp/metrics:
    endpoint: 'xxx'
    tls:
      insecure: false
      insecure_skip_verify: true
  otlphttp/traces:
    endpoint: 'xxx'
    tls:
      insecure: false
      insecure_skip_verify: true

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [logging, otlphttp/metrics]
    traces:
      receivers: [otlp]
      exporters: [logging, otlphttp/traces]
  telemetry:
    metrics:
      address: localhost:8888

Add the following env variable:

  • OPENTELEMETRY_COLLECTOR_CONFIG_FILE: /var/task/collector.yaml

Add the following new OTEL lambda layers:

  • arn:aws:lambda:us-east-1:184161586896:layer:opentelemetry-collector-amd64-0_1_0:1
  • arn:aws:lambda:us-east-1:184161586896:layer:opentelemetry-python-0_1_0:1

Increase the lambda timeout past the default of 3 sec (I increased it to 9 sec, as per this AWS blog post).
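For reference, the same setup can also be scripted. Below is a rough boto3 sketch covering the env variable, layers, and timeout described above (the function name my-otel-test is just a placeholder, and note that Environment.Variables replaces the function's existing variables):

import boto3

lambda_client = boto3.client("lambda", region_name="us-east-1")

# Apply the collector config path, the two new OTEL layers, and the longer timeout.
# Environment.Variables replaces the existing set, so include everything the function needs.
lambda_client.update_function_configuration(
    FunctionName="my-otel-test",  # placeholder function name
    Timeout=9,
    Environment={
        "Variables": {
            "OPENTELEMETRY_COLLECTOR_CONFIG_FILE": "/var/task/collector.yaml",
        }
    },
    Layers=[
        "arn:aws:lambda:us-east-1:184161586896:layer:opentelemetry-collector-amd64-0_1_0:1",
        "arn:aws:lambda:us-east-1:184161586896:layer:opentelemetry-python-0_1_0:1",
    ],
)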

What did you expect to see?

  • Metrics and traces in the lambda logs (i.e. in the CloudWatch log group corresponding to that lambda)
  • Metrics and traces in the external backend

What did you see instead?

  • No metrics or traces in the lambda logs
  • No metrics or traces in the external backend

What version of collector/language SDK did you use? Both new layers above are version 0_1_0; per the logs below, the bundled collector is v0.76.0.

What language layer did you use? Python

Additional context
With the new OTEL lambda layers, these are the lambda logs I get. As noted, they do not contain any metrics or traces, and the external backend (not shown) does not contain any either:

INIT_START Runtime Version: python:3.9.v23	Runtime Version ARN: arn:aws:lambda:us-east-1::runtime:xxx
{
    "level": "info",
    "ts": 1687530337.8501613,
    "msg": "Launching OpenTelemetry Lambda extension",
    "version": "v0.76.0"
}

{
    "level": "info",
    "ts": 1687530337.8537326,
    "logger": "telemetryAPI.Listener",
    "msg": "Listening for requests",
    "address": "sandbox:53612"
}

{
    "level": "info",
    "ts": 1687530337.8551075,
    "logger": "telemetryAPI.Client",
    "msg": "Subscribing",
    "baseURL": "http://127.0.0.1:9001/2022-07-01/telemetry"
}

TELEMETRY	Name: collector	State: Subscribed	Types: [Platform]
{
    "level": "info",
    "ts": 1687530337.8576913,
    "logger": "telemetryAPI.Client",
    "msg": "Subscription success",
    "response": "\"OK\""
}

{
    "level": "info",
    "ts": 1687530337.857857,
    "logger": "NewCollector",
    "msg": "Using config URI from environment",
    "uri": "/var/task/collector.yaml"
}

{
    "level": "info",
    "ts": 1687530337.8700795,
    "caller": "service/telemetry.go:92",
    "msg": "Setting up own telemetry..."
}

{
    "level": "Basic",
    "ts": 1687530337.8705246,
    "caller": "service/telemetry.go:149",
    "msg": "Serving Prometheus metrics",
    "address": "localhost:8888"
}

{
    "level": "info",
    "ts": 1687530337.8706422,
    "caller": "[email protected]/exporter.go:286",
    "msg": "Development component. May change in the future.",
    "kind": "exporter",
    "data_type": "metrics",
    "name": "logging"
}

{
    "level": "info",
    "ts": 1687530337.9061983,
    "caller": "[email protected]/exporter.go:286",
    "msg": "Development component. May change in the future.",
    "kind": "exporter",
    "data_type": "traces",
    "name": "logging"
}

{
    "level": "info",
    "ts": 1687530337.9088805,
    "caller": "service/service.go:129",
    "msg": "Starting otelcol-lambda...",
    "Version": "v0.76.0",
    "NumCPU": 2
}

{
    "level": "info",
    "ts": 1687530337.9090724,
    "caller": "extensions/extensions.go:41",
    "msg": "Starting extensions..."
}

{
    "level": "info",
    "ts": 1687530337.9092343,
    "caller": "[email protected]/otlp.go:94",
    "msg": "Starting GRPC server",
    "kind": "receiver",
    "name": "otlp",
    "data_type": "metrics",
    "endpoint": "localhost:4317"
}

{
    "level": "info",
    "ts": 1687530337.909417,
    "caller": "[email protected]/otlp.go:112",
    "msg": "Starting HTTP server",
    "kind": "receiver",
    "name": "otlp",
    "data_type": "metrics",
    "endpoint": "localhost:4318"
}

{
    "level": "info",
    "ts": 1687530337.9095626,
    "caller": "service/service.go:146",
    "msg": "Everything is ready. Begin running and processing data."
}

EXTENSION	Name: collector	State: Ready	Events: [INVOKE, SHUTDOWN]
START RequestId: 829cd60c-8b84-4c46-a327-eda7c5e26da1 Version: $LATEST
END RequestId: 829cd60c-8b84-4c46-a327-eda7c5e26da1
REPORT RequestId: 829cd60c-8b84-4c46-a327-eda7c5e26da1	Duration: 305.42 ms	Billed Duration: 306 ms	Memory Size: 128 MB	Max Memory Used: 82 MB	Init Duration: 377.18 ms	
{
    "level": "info",
    "ts": 1687530686.3654792,
    "logger": "lifecycle.manager",
    "msg": "Received SHUTDOWN event"
}

{
    "level": "info",
    "ts": 1687530686.3656828,
    "logger": "telemetryAPI.Listener",
    "msg": "HTTP Server closed:",
    "error": "http: Server closed"
}

{
    "level": "info",
    "ts": 1687530686.3658288,
    "caller": "otelcol/collector.go:246",
    "msg": "Received shutdown request"
}

{
    "level": "info",
    "ts": 1687530686.3849585,
    "caller": "service/service.go:155",
    "msg": "Starting shutdown..."
}

{
    "level": "info",
    "ts": 1687530686.3853285,
    "caller": "extensions/extensions.go:55",
    "msg": "Stopping extensions..."
}

{
    "level": "info",
    "ts": 1687530686.3853457,
    "caller": "service/service.go:169",
    "msg": "Shutdown complete."
}

{
    "level": "info",
    "ts": 1687530686.3854933,
    "msg": "done"
}

Instead, if you replace the two OTEL lambda layers with the following ADOT lambda layer:

  • arn:aws:lambda:us-east-1:901920570463:layer:aws-otel-python-amd64-ver-1-17-0:1

and add the following env variable:

  • AWS_LAMBDA_EXEC_WRAPPER: /opt/otel-instrument

then I see traces in the lambda logs, and both metrics and traces in the external backend (not shown here):

INIT_START Runtime Version: python:3.9.v23	Runtime Version ARN: arn:aws:lambda:us-east-1::runtime:xxx
{
    "level": "info",
    "ts": 1687533950.4487019,
    "msg": "Launching OpenTelemetry Lambda extension",
    "version": "v0.28.0"
}

{
    "level": "info",
    "ts": 1687533950.4531193,
    "logger": "telemetryAPI.Listener",
    "msg": "Listening for requests",
    "address": "sandbox:53612"
}

{
    "level": "info",
    "ts": 1687533950.4532046,
    "logger": "telemetryAPI.Client",
    "msg": "Subscribing",
    "baseURL": "http://127.0.0.1:9001/2022-07-01/telemetry"
}

TELEMETRY	Name: collector	State: Subscribed	Types: [Platform]
{
    "level": "info",
    "ts": 1687533950.4559968,
    "logger": "telemetryAPI.Client",
    "msg": "Subscription success",
    "response": "\"OK\""
}

{
    "level": "info",
    "ts": 1687533950.4560287,
    "logger": "NewCollector",
    "msg": "Using config URI from environment",
    "uri": "/var/task/collector.yaml"
}

{
    "level": "info",
    "ts": 1687533950.4672554,
    "caller": "service/telemetry.go:90",
    "msg": "Setting up own telemetry..."
}

{
    "level": "Basic",
    "ts": 1687533950.4681895,
    "caller": "service/telemetry.go:116",
    "msg": "Serving Prometheus metrics",
    "address": "localhost:8888"
}

{
    "level": "info",
    "ts": 1687533950.4682777,
    "caller": "[email protected]/exporter.go:286",
    "msg": "Development component. May change in the future.",
    "kind": "exporter",
    "data_type": "traces",
    "name": "logging"
}

{
    "level": "info",
    "ts": 1687533950.472519,
    "caller": "[email protected]/exporter.go:286",
    "msg": "Development component. May change in the future.",
    "kind": "exporter",
    "data_type": "metrics",
    "name": "logging"
}

{
    "level": "info",
    "ts": 1687533950.4890018,
    "caller": "service/service.go:129",
    "msg": "Starting aws-otel-lambda...",
    "Version": "v0.28.0",
    "NumCPU": 2
}

{
    "level": "info",
    "ts": 1687533950.4892137,
    "caller": "extensions/extensions.go:41",
    "msg": "Starting extensions..."
}

{
    "level": "info",
    "ts": 1687533950.490859,
    "caller": "[email protected]/otlp.go:94",
    "msg": "Starting GRPC server",
    "kind": "receiver",
    "name": "otlp",
    "data_type": "metrics",
    "endpoint": "localhost:4317"
}

{
    "level": "info",
    "ts": 1687533950.4911032,
    "caller": "[email protected]/otlp.go:112",
    "msg": "Starting HTTP server",
    "kind": "receiver",
    "name": "otlp",
    "data_type": "metrics",
    "endpoint": "localhost:4318"
}

{
    "level": "info",
    "ts": 1687533950.491234,
    "caller": "service/service.go:146",
    "msg": "Everything is ready. Begin running and processing data."
}

EXTENSION	Name: collector	State: Ready	Events: [INVOKE, SHUTDOWN]
START RequestId: 9ac3d6a3-b727-4486-9f06-b05d65fecf7d Version: $LATEST
{
    "level": "info",
    "ts": 1687533951.8895519,
    "msg": "TracesExporter",
    "kind": "exporter",
    "data_type": "traces",
    "name": "logging",
    "#spans": 1
}

{
    "level": "info",
    "ts": 1687533951.8896523,
    "msg": "ResourceSpans #0\nResource SchemaURL: \nResource attributes:\n     -> telemetry.sdk.language: Str(python)\n     -> telemetry.sdk.name: Str(opentelemetry)\n     -> telemetry.sdk.version: Str(1.17.0)\n     -> cloud.region: Str(us-east-1)\n     -> cloud.provider: Str(aws)\n     -> faas.name: Str(xxx)\n     -> faas.version: Str($LATEST)\n     -> faas.instance: Str(2023/06/23/[$LATEST]xxx)\n     -> service.name: Str(xxx)\n     -> telemetry.auto.version: Str(0.38b0)\nScopeSpans #0\nScopeSpans SchemaURL: \nInstrumentationScope opentelemetry.instrumentation.aws_lambda 0.38b0\nSpan #0\n    Trace ID       : 9448308f423c682768407c60d64be8e9\n    Parent ID      : \n    ID             : 15bb2fda670ac79b\n    Name           : lambda_function.lambda_handler\n    Kind           : Server\n    Start time     : 2023-06-23 15:25:51.473425744 +0000 UTC\n    End time       : 2023-06-23 15:25:51.827264627 +0000 UTC\n    Status code    : Unset\n    Status message : \nAttributes:\n     -> faas.id: Str(arn:aws:lambda:us-east-1:xxx:function:xxx)\n     -> faas.execution: Str(9ac3d6a3-b727-4486-9f06-b05d65fecf7d)\n",
    "kind": "exporter",
    "data_type": "traces",
    "name": "logging"
}

END RequestId: 9ac3d6a3-b727-4486-9f06-b05d65fecf7d
REPORT RequestId: 9ac3d6a3-b727-4486-9f06-b05d65fecf7d	Duration: 615.12 ms	Billed Duration: 616 ms	Memory Size: 128 MB	Max Memory Used: 110 MB	Init Duration: 1249.72 ms

pranavmarla • Jun 23 '23 15:06

It works for me. Did you follow the instructions here? https://opentelemetry.io/docs/faas/lambda-auto-instrument/ You still need to set AWS_LAMBDA_EXEC_WRAPPER=/opt/otel-instrument
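For example, a rough boto3 sketch of adding that variable (placeholder function name; since Environment.Variables replaces the whole set, keep the collector config variable from above as well):

import boto3

lambda_client = boto3.client("lambda", region_name="us-east-1")

# Environment.Variables replaces the existing variables, so re-specify every one the function needs.
lambda_client.update_function_configuration(
    FunctionName="my-otel-test",  # placeholder function name
    Environment={
        "Variables": {
            "AWS_LAMBDA_EXEC_WRAPPER": "/opt/otel-instrument",
            "OPENTELEMETRY_COLLECTOR_CONFIG_FILE": "/var/task/collector.yaml",
        }
    },
)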

shuwpan • Aug 01 '23 22:08