
Enable Jaeger Service Performance Management (SPM) experimental feature

Open ramonguiu opened this issue 3 years ago • 2 comments

Configure Jaeger and the OTel Collector to enable Jaeger SPM experimental feature
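
For context, enabling SPM generally means running traces through the spanmetrics processor and exposing the generated RED metrics on a Prometheus endpoint that Jaeger's query service can scrape. A minimal sketch of such a collector config follows; the endpoint addresses, ports, and exporter names here are assumptions for illustration, not the demo's actual values:

```yaml
receivers:
  otlp:
    protocols:
      grpc:

processors:
  spanmetrics:
    # Name of the exporter the generated metrics are sent to.
    metrics_exporter: prometheus

exporters:
  # Prometheus scrape endpoint for the span-derived metrics (port assumed).
  prometheus:
    endpoint: "0.0.0.0:8889"
  # Forward the original traces to Jaeger (address assumed).
  jaeger:
    endpoint: "jaeger:14250"
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [spanmetrics]
      exporters: [jaeger]
    # The exporter named in metrics_exporter must belong to a metrics pipeline.
    metrics:
      receivers: [otlp]
      exporters: [prometheus]
```

Jaeger itself then needs to be pointed at the Prometheus server (e.g. via `METRICS_STORAGE_TYPE=prometheus`) for the Monitor tab to show the SPM data.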

ramonguiu avatar Jun 25 '22 15:06 ramonguiu

I am not 100% sure we should merge it. The spanmetrics processor crashes from time to time, forcing the OTel Collector to restart, and I am not sure what is causing it.

panic: runtime error: index out of range [17] with length 17 [recovered]
	panic: runtime error: index out of range [17] with length 17

goroutine 172 [running]:
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End.func1()
	go.opentelemetry.io/otel/[email protected]/trace/span.go:359 +0x2a
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End(0xc001d56180, {0x0, 0x0, 0x403be6?})
	go.opentelemetry.io/otel/[email protected]/trace/span.go:398 +0x8dd
panic({0x483d560, 0xc001d7c390})
	runtime/panic.go:838 +0x207
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).updateLatencyMetrics(...)
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:442
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).aggregateMetricsForSpan(0xc000574780, {0xc001c169bb, 0x5}, {0x0?}, {0x8?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:395 +0x466
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).aggregateMetricsForServiceSpans(0x0?, {0x4bde08a?}, {0xc001c169bb, 0x5})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:379 +0x7d
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).aggregateMetrics(0x4?, {0x0?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:368 +0xd1
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).ConsumeTraces(0xc000574780, {0x54e0000, 0xc001c13aa0}, {0x0?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:233 +0x34
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export(0xc000b00150, {0x54e0000, 0xc001c13a10}, {0xc000181c80?})
	go.opentelemetry.io/[email protected]/receiver/otlpreceiver/internal/trace/otlp.go:60 +0xd3
go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export({{0x54af3e0?, 0xc000b00150?}}, {0x54e0000, 0xc001c13a10}, 0xc001c50840)
	go.opentelemetry.io/collector/[email protected]/ptrace/ptraceotlp/traces.go:167 +0xff
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1({0x54e0000, 0xc001c13a10}, {0x4991820?, 0xc001c50840})
	go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/trace/v1/trace_service.pb.go:216 +0x78
go.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1({0x54e0000?, 0xc001c139b0?}, {0x4991820, 0xc001c50840}, 0x0?, 0xc001c50858)
	go.opentelemetry.io/[email protected]/config/configgrpc/configgrpc.go:385 +0x4c
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x54e0000?, 0xc001c139b0?}, {0x4991820?, 0xc001c50840?})
	google.golang.org/[email protected]/server.go:1117 +0x5b
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1({0x54e0000, 0xc001c138c0}, {0x4991820, 0xc001c50840}, 0xc001d54000, 0xc001d3f400)
	go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/[email protected]/interceptor.go:325 +0x664
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x54e0000?, 0xc001c138c0?}, {0x4991820?, 0xc001c50840?})
	google.golang.org/[email protected]/server.go:1120 +0x83
google.golang.org/grpc.chainUnaryInterceptors.func1({0x54e0000, 0xc001c138c0}, {0x4991820, 0xc001c50840}, 0xc001d54000, 0xc001c50858)
	google.golang.org/[email protected]/server.go:1122 +0x12b
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler({0x41e2900?, 0xc00051b200}, {0x54e0000, 0xc001c138c0}, 0xc000323200, 0xc000691bc0)
	go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/trace/v1/trace_service.pb.go:218 +0x138
google.golang.org/grpc.(*Server).processUnaryRPC(0xc0006d7340, {0x54f4bb8, 0xc000ac8b60}, 0xc00076de60, 0xc000833ec0, 0x7dd94d0, 0x0)
	google.golang.org/[email protected]/server.go:1283 +0xcfd
google.golang.org/grpc.(*Server).handleStream(0xc0006d7340, {0x54f4bb8, 0xc000ac8b60}, 0xc00076de60, 0x0)
	google.golang.org/[email protected]/server.go:1620 +0xa1b
google.golang.org/grpc.(*Server).serveStreams.func1.2()
	google.golang.org/[email protected]/server.go:922 +0x98
created by google.golang.org/grpc.(*Server).serveStreams.func1
	google.golang.org/[email protected]/server.go:920 +0x28a
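
The panic message ("index out of range [17] with length 17") is the classic symptom of a value being bucketed into an index equal to a slice's length, i.e. one past the last valid element. The sketch below only illustrates that failure mode and how an extra overflow bucket avoids it; `bucketIndex` and the slice sizes are assumptions for illustration, not the processor's actual code or root cause:

```go
package main

import "fmt"

// bucketIndex returns the histogram bucket for a latency value.
// A latency above the last bound maps to index len(bounds), so the
// counts slice must have len(bounds)+1 entries; if it only has
// len(bounds), the write panics with "index out of range [N] with
// length N", matching the shape of the panic in the stack trace.
func bucketIndex(bounds []float64, latency float64) int {
	for i, b := range bounds {
		if latency <= b {
			return i
		}
	}
	return len(bounds) // overflow bucket for out-of-range latencies
}

func main() {
	bounds := []float64{2, 4, 6}
	counts := make([]uint64, len(bounds)+1) // +1 overflow bucket avoids the panic
	counts[bucketIndex(bounds, 10)]++       // latency beyond the last bound
	fmt.Println(bucketIndex(bounds, 10), len(counts)) // prints "3 4"
}
```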

ramonguiu avatar Jun 25 '22 15:06 ramonguiu

I updated promscale to version 0.14.0 and no longer have this problem.

janssenlima avatar Oct 07 '22 04:10 janssenlima