opentelemetry-python
Support JSON over HTTP in OTLP exporter
The OTLP exporter spec recommends supporting both protobuf over gRPC and JSON over HTTP for exporting telemetry: https://github.com/open-telemetry/opentelemetry-specification/pull/699/files
**Describe the solution you'd like**
Currently the implementation only supports gRPC; this feature request is to support JSON over HTTP.
@codeboten I'll pick this up if it is not done already.
Ah I just noticed that there is one more issue #1106 that is related to this. Maybe it's better if these are completed sequentially.
pls assign this to me.
Hello @codeboten, @lonewolf3739,
My idea is to follow the OTLP getting-started example and create a simple span for debugging, roughly as in the sketch below.
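A minimal version of that, based on the getting-started example (using ConsoleSpanExporter as a stand-in so the encoded spans can be inspected locally):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import (
    BatchSpanProcessor,
    ConsoleSpanExporter,
)

# Print finished spans to the console so the encoded output can be
# inspected while developing the JSON/HTTP exporter.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(ConsoleSpanExporter())
)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("debug-span"):
    pass  # the span is exported when the block exits
```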
As part of the fix I wanted to do two things:
- test the output of JSON-encoded protobuf (see the sketch after this list)
- simulate all kinds of errors in the spec, like 400/500. Do you suggest setting up a local collector, or is there an easy way to debug these? If a local collector needs to be set up, please point me to the docs.
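For the first item, a quick sketch of how I'd inspect the JSON encoding (assuming the generated classes from opentelemetry-proto and protobuf's built-in JSON mapping):

```python
from google.protobuf.json_format import MessageToJson

from opentelemetry.proto.collector.trace.v1.trace_service_pb2 import (
    ExportTraceServiceRequest,
)

# An empty request here; in the exporter this would be populated from
# SDK spans before encoding.
request = ExportTraceServiceRequest()
print(MessageToJson(request))  # the JSON body that would go over HTTP
```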
I also had the following questions about the spec: https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/protocol/otlp.md#otlphttp
> OTLP/HTTP uses HTTP POST requests to send telemetry data from clients to servers. Implementations MAY use HTTP/1.1 or HTTP/2 transports. Implementations that use HTTP/2 transport SHOULD fallback to HTTP/1.1 transport if HTTP/2 connection cannot be established.
How do we implement HTTP/2? I did not find any params in requests.post to configure HTTP/2. How should the fallback mechanism work?
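One option I came across is the third-party httpx library (an assumption on my part, since the codebase uses requests and requests has no HTTP/2 support). It negotiates HTTP/2 via TLS ALPN and falls back to HTTP/1.1 automatically, which seems to match the SHOULD-fallback wording:

```python
# Assumes httpx installed with HTTP/2 support: pip install httpx[http2]
import httpx

# http2=True advertises HTTP/2 during ALPN negotiation; if the server
# cannot establish an HTTP/2 connection, httpx falls back to HTTP/1.1.
with httpx.Client(http2=True) as client:
    response = client.post(
        "https://collector.example.com/v1/traces",  # hypothetical endpoint
        content=b"{}",  # placeholder JSON payload
        headers={"Content-Type": "application/json"},
    )
    print(response.http_version)  # "HTTP/2" or "HTTP/1.1"
```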
https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/protocol/otlp.md#otlphttp-request
> The client MAY gzip the content and in that case MUST include "Content-Encoding: gzip" request header. The client MAY include "Accept-Encoding: gzip" request header if it can receive gzip-encoded responses.
I'm assuming this should be a configurable param in the constructor of the exporter. Please let me know if that's not correct.
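Something like this is what I have in mind (the `compression` parameter and the `_export` helper are hypothetical, not existing API):

```python
import gzip

import requests


def _export(endpoint: str, body: bytes, compression: bool = False):
    # When compression is enabled, gzip the payload and add the
    # "Content-Encoding: gzip" header the spec requires.
    headers = {"Content-Type": "application/json"}
    if compression:
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    return requests.post(endpoint, data=body, headers=headers)
```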
https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/protocol/otlp.md#success
> On success the server MUST respond with HTTP 200 OK. Response body MUST be Protobuf-encoded ExportTraceServiceResponse message for traces, ExportMetricsServiceResponse message for metrics and ExportLogsServiceResponse message for logs.
Currently the ExportTraceServiceResponse message seems to be empty. What needs to be logged, just a success message?
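For reference, decoding the success body would look roughly like this (assuming the generated classes from opentelemetry-proto); since the message carries no fields today, there is nothing to inspect beyond the 200 itself:

```python
from opentelemetry.proto.collector.trace.v1.trace_service_pb2 import (
    ExportTraceServiceResponse,
)


def handle_success(body: bytes) -> None:
    response = ExportTraceServiceResponse()
    response.ParseFromString(body)  # currently an empty message
    # Nothing to record beyond the fact that the export succeeded.
```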
https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/protocol/otlp.md#failures
> If the processing of the request fails the server MUST respond with appropriate HTTP 4xx or HTTP 5xx status code. See sections below for more details about specific failure cases and HTTP status codes that should be used.
> Response body for all HTTP 4xx and HTTP 5xx responses MUST be a Protobuf-encoded Status message that describes the problem.
> This specification does not use Status.code field and the server MAY omit Status.code field. The clients are not expected to alter their behavior based on Status.code field but MAY record it for troubleshooting purposes.
> The Status.message field SHOULD contain a developer-facing error message as defined in Status message schema.
> The server MAY include Status.details field with additional details. Read below about what this field can contain in each specific failure case.
For all of these errors I'm assuming the failure just needs to be logged.
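A minimal sketch of that logging, assuming google.rpc.Status from googleapis-common-protos for decoding the error body (the handle_failure helper is hypothetical):

```python
import logging

from google.rpc import status_pb2

logger = logging.getLogger(__name__)


def handle_failure(status_code: int, body: bytes) -> None:
    status = status_pb2.Status()
    status.ParseFromString(body)
    # Per the spec, clients MAY record Status.code for troubleshooting
    # but should not alter behavior based on it; Status.message carries
    # the developer-facing description.
    logger.error(
        "Export failed with HTTP %d: %s (code=%d)",
        status_code,
        status.message,
        status.code,
    )
```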
Hello @dmarar,
Could you please let me know if you are still working on this issue?
> Hello @dmarar,
> Could you please let me know if you are still working on this issue?

Hello @Shihao-Zhong, sorry for the late reply. I couldn't start much on this because of a personal emergency. Please feel free to work on it if you want to.
Thanks!
Hi, is there any update? I'm really looking forward to this feature.
Hi -- there are work items that have to happen first and I am working on those, so this feature is in progress. I can't promise an ETA at this point yet though. Will update this ticket when that firms up.
@pmcollins can you give any insight into what is blocking this work? I'd also be happy to contribute to getting this working.
Hi @MikeGoldsmith, I'm happy to hand this over to you if you are interested. IMO the task is more complicated than it needs to be because of some tech debt in the exporters that I wanted to address first, but there's no reason that it can't be addressed later.
Sure, please assign to me and I'll try to make progress on it this week. Thanks 👍🏻
I've created a draft for adding the OTLP JSON exporter, just traces at first. For anyone interested, please take a look.
Unassigning @MikeGoldsmith due to time constraints. Thanks for working on this. This feature is still open for anyone to pick up!