Bug: Tracer escape hatch does not expose the full AWSXRay API
### Expected Behavior

Based on the documentation (Escape hatch mechanism):

> You can use the `tracer.provider` attribute to access all methods provided by the AWS X-Ray SDK.

But this is not true: it only re-exposes a subset of the methods, see https://github.com/aws-powertools/powertools-lambda-typescript/blob/main/packages/tracer/src/provider/ProviderService.ts

In my case I wanted to override the `setStreamingThreshold()` method, which is not implemented.
### Current Behavior

An error is thrown because the method is not implemented.
### Code snippet

```typescript
const tracer = new Tracer({ serviceName: "serverlessAirline" });
tracer.provider.setStreamingThreshold(1);
```

produces an error:

```
{
  "errorType": "TypeError",
  "errorMessage": "tracer.provider.setStreamingThreshold is not a function"
  // ....
}
```
### Steps to Reproduce

See the code snippet above.
### Possible Solution

1. Update the docs.
2. Implement/proxy all methods.
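For option 2, one possible approach (a minimal sketch, not the actual Powertools implementation) is a JavaScript `Proxy` that forwards any method the provider does not re-implement to the underlying SDK object, so the escape hatch would expose the full API. The `fakeSdk` and `partialProvider` names below are illustrative stand-ins, not Powertools or `aws-xray-sdk-core` internals.

```typescript
type AnyFn = (...args: unknown[]) => unknown;

// Stand-in for aws-xray-sdk-core: exposes more methods than the provider.
const fakeSdk: Record<string, AnyFn> = {
  setContextMissingStrategy: (s) => `sdk strategy=${String(s)}`,
  setStreamingThreshold: (n) => `sdk threshold=${String(n)}`,
};

// Stand-in for ProviderService: re-implements only a subset of methods.
const partialProvider: Record<string, AnyFn> = {
  setContextMissingStrategy: (s) => `provider strategy=${String(s)}`,
};

// The Proxy prefers the provider's own methods and falls back to the SDK,
// so calls like setStreamingThreshold() no longer throw a TypeError.
const provider = new Proxy(partialProvider, {
  get(target, prop) {
    const key = String(prop);
    return key in target ? target[key] : fakeSdk[key];
  },
});

console.log(provider.setContextMissingStrategy("LOG_ERROR")); // provider's own method
console.log(provider.setStreamingThreshold(1)); // forwarded to the SDK
```

A downside of this design is that forwarded methods bypass any Lambda-specific safeguards the provider might want to enforce, which may be exactly why only a subset is exposed today.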
### Powertools for AWS Lambda (TypeScript) version
latest
### AWS Lambda function runtime
20.x
### Packaging format used
Lambda Layers
### Execution logs
```Shell
{
  "errorType": "TypeError",
  "errorMessage": "tracer.provider.setStreamingThreshold is not a function",
  "trace": [
    "TypeError: tracer.provider.setStreamingThreshold is not a function",
    " at Object.<anonymous> (/var/task/index.js:6:17)",
    " at Module._compile (node:internal/modules/cjs/loader:1358:14)",
    " at Module._extensions..js (node:internal/modules/cjs/loader:1416:10)",
    " at Module.load (node:internal/modules/cjs/loader:1208:32)",
    " at Module._load (node:internal/modules/cjs/loader:1024:12)",
    " at Module.require (node:internal/modules/cjs/loader:1233:19)",
    " at require (node:internal/modules/helpers:179:18)",
    " at _tryRequireFile (file:///var/runtime/index.mjs:1002:37)",
    " at _tryRequire (file:///var/runtime/index.mjs:1052:25)",
    " at _loadUserApp (file:///var/runtime/index.mjs:1081:22)"
  ]
}
```
Hi @RaphaelManke, we'll get the docs updated to clarify that we expose only some methods.
Regarding the one you want to overwrite, when working on Lambda the streaming threshold must be set to 0, meaning a segment is sent to X-Ray as soon as it's closed.
This is to avoid data loss.
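For intuition about what the threshold does, here is a toy model (not the SDK's actual code): closed subsegments accumulate in a buffer, and once their count exceeds the threshold they are flushed in one batch, so a threshold of 0 streams every subsegment as soon as it is closed. The `ToySegment` class is purely illustrative.

```typescript
// Toy model of the streaming threshold (illustrative, not the real SDK logic).
class ToySegment {
  private closed: string[] = [];
  public streamed: string[][] = []; // each entry is one flushed batch

  constructor(private threshold: number) {}

  closeSubsegment(name: string): void {
    this.closed.push(name);
    // Flush once the number of closed subsegments exceeds the threshold.
    if (this.closed.length > this.threshold) {
      this.streamed.push(this.closed);
      this.closed = [];
    }
  }
}

const eager = new ToySegment(0); // Lambda-style: stream immediately
eager.closeSubsegment("a");
eager.closeSubsegment("b");
// eager.streamed is [["a"], ["b"]] -> nothing left buffered if the function freezes

const buffered = new ToySegment(1);
buffered.closeSubsegment("a"); // buffered (1 closed, not > 1)
buffered.closeSubsegment("b"); // 2 closed > 1 -> flush ["a", "b"]
// buffered.streamed is [["a", "b"]]
```

This shows why a non-zero threshold risks data loss on Lambda: anything still sitting in the buffer when the execution environment is frozen may never be sent.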
I should add that if you really want to change the threshold, you can import the function directly from `aws-xray-sdk-core` and it'll take effect.
Thanks for the fast feedback 😃
Yes, I noticed that

```typescript
const AWSXRay = require("aws-xray-sdk-core");
AWSXRay.setStreamingThreshold(1);
```

also works 😃 Maybe that could be mentioned in the new escape hatch docs.
In my use case I want to redirect the X-Ray UDP packets to a Lambda extension. This is not working as expected right now, and while investigating I found in the X-Ray SDK docs (https://docs.aws.amazon.com/xray-sdk-for-nodejs/latest/reference/segments_segment_utils.js.html) that the default is 100, so I tried to lower that value.
The link you referenced shows that the threshold is actually set to 0, which I can confirm in the logs. So my problem seems to be somewhere else (or maybe I just overlooked something and it's actually working). But that's not relevant for this issue 😆
Hey @RaphaelManke and @dreamorosi! My 2 cents here:
In Python we enforce the threshold to be zero, so we make sure we override the SDK default value, which is 30. But in Node.js it looks like the SDK does this automatically when running in Lambda, which is cool.
@RaphaelManke just out of curiosity: in your case, is configuring/sending the batch size not working, or are you having trouble redirecting the X-Ray agent's default IP/port?
@leandrodamascena I am building a POC that sends X-Ray data to an SQS queue using the Lambda Extensions API, based on https://github.com/aws-samples/aws-lambda-extensions/tree/main/nodejs-example-telemetry-api-extension

I experienced a timing issue where the messages weren't yet passed to the "offload queue". For now, a delay fixed it.
The final POC solution should look like this
But this is all not related to this issue 😅 I am happy to discuss it on Discord (either BelieveInServerless -> Observability, or Powertools).
⚠️ COMMENT VISIBILITY WARNING ⚠️
This issue is now closed. Please be mindful that future comments are hard for our team to see.
If you need more assistance, please either tag a team member or open a new issue that references this one.
If you wish to keep having a conversation with other community members under this issue feel free to do so.
This is now released under version v2.8.0!