dd-trace-java

Datadog APM client for Java

Results: 483 dd-trace-java issues, sorted by recently updated

Hi DD team, we set up the DD agent in our production env last week and we have seen 6 random JVM crashes since then. - dd-java-agent: v1.25.1 - JVM: Corretto-17.0.5.8.1...

type: bug
comp: profiling

Hey, I have some questions related to tracing in my app. I have 2 services - Service1 is using Spring Boot 3.1.6 with SLF4J/Logback. I use Datadog as a Java...

Could anyone confirm that the dd-agent works with the latest version of `mongodb-driver`? In my team we are currently using the latest agent version and noticed that Mongo traces are missing...

# What Does This Do POC for a refactor of product initialization. # Motivation * Less reflection. * Further encapsulation of the initialization logic of each product. # Additional Notes *...

tag: do not merge
comp: core
tag: no release notes
type: refactoring

# What Does This Do Flush Data Streams Monitoring stats when the tracer is stopped. For example, when a Lambda function exits. # Motivation Data Streams stats were missing when executing Lambda...

type: enhancement
comp: data streams

Hi all, need a hint - I'm working on a setup where two Spring Boot services are talking through Kafka. `service A` emits an `event1` that is consumed by `service...

We set `dd.trace.db.client.split-by-instance=true` to [assign the database instance name](https://docs.datadoghq.com/tracing/trace_collection/library_config/java/) as the service name of the DB spans. But there is no configuration to customize the names. It would be nice...

When using OpenTelemetry instrumentation, I'd expect that, after making a propagated context current, new spans automatically have that context as their parent, as described in the [OpenTelemetry documentation](https://opentelemetry.io/docs/instrumentation/java/manual/#context-propagation). However, I...

type: feature request
inst: opentelemetry
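For context, here is a minimal sketch of the propagation behavior the issue above refers to, using the standard OpenTelemetry Java API (`opentelemetry-api`). Class and span names such as `PropagationSketch` and `handle-request` are illustrative only and not taken from the issue; whether the Datadog OpenTelemetry instrumentation honors this exact flow is the question being raised.

```java
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.context.Context;
import io.opentelemetry.context.Scope;
import io.opentelemetry.context.propagation.TextMapGetter;

import java.util.Map;

public class PropagationSketch {

    // Reads propagation headers (e.g. W3C traceparent) out of a simple Map carrier.
    private static final TextMapGetter<Map<String, String>> GETTER =
        new TextMapGetter<>() {
            @Override
            public Iterable<String> keys(Map<String, String> carrier) {
                return carrier.keySet();
            }

            @Override
            public String get(Map<String, String> carrier, String key) {
                return carrier == null ? null : carrier.get(key);
            }
        };

    public static void handle(Map<String, String> incomingHeaders) {
        // Extract the propagated context from the incoming request headers.
        Context extracted = GlobalOpenTelemetry.getPropagators()
            .getTextMapPropagator()
            .extract(Context.current(), incomingHeaders, GETTER);

        Tracer tracer = GlobalOpenTelemetry.getTracer("example-instrumentation");

        // Making the extracted context current...
        try (Scope ignored = extracted.makeCurrent()) {
            // ...should mean that a new span picks it up as its parent automatically,
            // because spanBuilder() defaults to Context.current() as the parent.
            Span child = tracer.spanBuilder("handle-request").startSpan();
            try (Scope childScope = child.makeCurrent()) {
                // business logic here
            } finally {
                child.end();
            }
        }
    }
}
```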

When a `Controller` with a `suspend` handler function is used, and there's a `WebFilter` that uses the `mono {}` builder function from Kotlin, traces from the `WebFilter` are not propagated to the...

# What Does This Do Remove details tag for `spark.stage` and `spark.sql` spans # Motivation This field is very heavy, since it contains the full stacktrace that triggered the spark...

inst: apache spark