Bug
Resolution: Unresolved
rhosdt-3.7
Quality / Stability / Reliability
Version of components:
Tempo Operator built from the latest upstream branch.
Description of the issue:
When viewing Trace Timeline > Trace JSON in the Tempo Query frontend, the request fails with the following error:
{"data":null,"total":0,"limit":0,"offset":0,"errors":[{"code":500,"msg":"grpc stream error: rpc error: code = Unavailable desc = error reading from server: EOF"}]}
The Tempo Query frontend pod crashes with the following panic:
ts=1751879684.8372045 level=info msg="FindTraces: fetching traces" traceids=0
ts=1751879690.3362782 level=info msg="FindTraces: fetching traces" traceids=10
panic: invalid Go type model.TraceID for field jaeger.storage.v1.GetTraceRequest.trace_id

goroutine 4005 [running]:
google.golang.org/protobuf/internal/impl.newSingularConverter({0xf8f408, 0xdba860}, {0xf8eb08, 0xc0001e2388})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/convert.go:142 +0xd47
google.golang.org/protobuf/internal/impl.NewConverter({0xf8f408, 0xdba860}, {0xf8eb08, 0xc0001e2388})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/convert.go:60 +0x8f
google.golang.org/protobuf/internal/impl.fieldInfoForScalar({0xf8eb08, 0xc0001e2388}, {{0xd8ff77, 0x7}, {0x0, 0x0}, {0xf8f408, 0xdba860}, {0xd8ff80, 0x88}, ...}, ...)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/message_reflect_field.go:268 +0x1d7
google.golang.org/protobuf/internal/impl.(*MessageInfo).makeKnownFieldsFunc(0xc000302300, {0x58, {0xf8f408, 0xccd180}, 0x40, {0xf8f408, 0xcbc560}, 0xffffffffffffffff, {0x0, 0x0}, ...})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/message_reflect.go:78 +0x74a
google.golang.org/protobuf/internal/impl.(*MessageInfo).makeReflectFuncs(0xc000302300, {0xf8f408, 0xdd7c20}, {0x58, {0xf8f408, 0xccd180}, 0x40, {0xf8f408, 0xcbc560}, 0xffffffffffffffff, ...})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/message_reflect.go:42 +0x78
google.golang.org/protobuf/internal/impl.(*MessageInfo).initOnce(0xc000302300)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/message.go:92 +0x1d0
google.golang.org/protobuf/internal/impl.(*MessageInfo).init(...)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/message.go:71
google.golang.org/protobuf/internal/impl.(*messageReflectWrapper).ProtoMethods(0xf75340?)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/internal/impl/message_reflect_gen.go:162 +0x25
google.golang.org/protobuf/proto.protoMethods(...)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/proto/proto_methods.go:19
google.golang.org/protobuf/proto.UnmarshalOptions.unmarshal({{}, 0x1, 0x1, 0x0, {0xf76ad8, 0xc0000e32c0}, 0x2710, 0x0}, {0xc000161985, 0x2c, ...}, ...)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/proto/decode.go:100 +0xe2
google.golang.org/protobuf/proto.Unmarshal({0xc000161985, 0x2c, 0x2c}, {0xf75340?, 0xc0004243c0?})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/protobuf/proto/decode.go:62 +0x6e
google.golang.org/grpc/encoding/proto.(*codecV2).Unmarshal(0x101?, {0xc0004243b0?, 0x1?, 0x1?}, {0xdf7dc0?, 0xc0004125a0?})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/encoding/proto/proto.go:80 +0xfa
google.golang.org/grpc.recv(0x260918?, {0x7fae5af153e8, 0x163f920}, {0xf731c0?, 0xc000412420?}, {0x0?, 0x0?}, {0xdf7dc0, 0xc0004125a0}, 0x400000, ...)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/rpc_util.go:911 +0x11f
google.golang.org/grpc.(*serverStream).RecvMsg(0xc0003e2a80, {0xdf7dc0, 0xc0004125a0})
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/stream.go:1760 +0x192
github.com/jaegertracing/jaeger/proto-gen/storage_v1._SpanReaderPlugin_GetTrace_Handler({0xde8f80, 0xc0002aa000}, {0xf81088, 0xc0003e2a80})
        /home/runner/work/tempo/tempo/vendor/github.com/jaegertracing/jaeger/proto-gen/storage_v1/storage.pb.go:1473 +0x57
google.golang.org/grpc.(*Server).processStreamingRPC(0xc00015e600, {0xf7d418, 0xc000179710}, 0xc000412420, 0xc000178b40, 0x1608da0, 0x0)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/server.go:1695 +0x1252
google.golang.org/grpc.(*Server).handleStream(0xc00015e600, {0xf7d650, 0xc00001c000}, 0xc000412420)
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/server.go:1819 +0xb47
google.golang.org/grpc.(*Server).serveStreams.func2.1()
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/server.go:1035 +0x7f
created by google.golang.org/grpc.(*Server).serveStreams.func2 in goroutine 3971
        /home/runner/work/tempo/tempo/vendor/google.golang.org/grpc/server.go:1046 +0x11d
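The panic message points at a protobuf codec mismatch: GetTraceRequest in the vendored github.com/jaegertracing/jaeger/proto-gen/storage_v1 package is generated with gogo/protobuf and carries the custom Go type model.TraceID for its trace_id field, which the reflection-based google.golang.org/protobuf codec (used by the gRPC codecV2 in the stack above) cannot build a converter for. The following is a minimal, untested sketch of that failure mode, not the actual Tempo Query code; it assumes the Jaeger module is on the module path and that marshalling triggers the same lazy reflection setup as the Unmarshal call in the trace:

package main

import (
	"fmt"

	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/protoadapt"

	// Package path taken from the stack trace above (vendored by Tempo).
	storage_v1 "github.com/jaegertracing/jaeger/proto-gen/storage_v1"
)

func main() {
	// gogo-generated message whose TraceID field has the custom Go type model.TraceID.
	req := &storage_v1.GetTraceRequest{}

	// protoadapt wraps the legacy (gogo/v1-style) message so the v2 API accepts it,
	// which is effectively what google.golang.org/grpc's proto codec does before the
	// proto.Unmarshal call that panics in the goroutine above.
	_, err := proto.Marshal(protoadapt.MessageV2Of(req))

	// Expectation (assumption based on the trace): this does not return an error but
	// panics with "invalid Go type model.TraceID for field jaeger.storage.v1.GetTraceRequest.trace_id".
	fmt.Println(err)
}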
The Jaeger Query container logs the following errors:
{"level":"info","ts":1751881204.0381281,"caller":"grpc@v1.71.1/clientconn.go:1224","msg":"[core] [Channel #1 SubChannel #10]Subchannel Connectivity change to CONNECTING"} {"level":"info","ts":1751881204.0381706,"caller":"grpc@v1.71.1/clientconn.go:1344","msg":"[core] [Channel #1 SubChannel #10]Subchannel picks a new address \"localhost:7777\" to connect"} {"level":"info","ts":1751881204.0385113,"caller":"pickfirst/pickfirst.go:184","msg":"[pick-first-lb] [pick-first-lb 0xc00067e870] Received SubConn state update: 0xc00053e410, {ConnectivityState:CONNECTING ConnectionError:<nil> connectedAddress:{Addr: ServerName: Attributes:<nil> BalancerAttributes:<nil> Metadata:<nil>}}"} {"level":"info","ts":1751881204.0385873,"caller":"grpc@v1.71.1/clientconn.go:563","msg":"[core] [Channel #1]Channel Connectivity change to CONNECTING"} {"level":"info","ts":1751881204.0387719,"caller":"grpc@v1.71.1/clientconn.go:1224","msg":"[core] [Channel #1 SubChannel #10]Subchannel Connectivity change to READY"} {"level":"info","ts":1751881204.038799,"caller":"pickfirst/pickfirst.go:184","msg":"[pick-first-lb] [pick-first-lb 0xc00067e870] Received SubConn state update: 0xc00053e410, {ConnectivityState:READY ConnectionError:<nil> connectedAddress:{Addr:localhost:7777 ServerName:localhost:7777 Attributes:<nil> BalancerAttributes:<nil> Metadata:<nil>}}"} {"level":"info","ts":1751881204.0388103,"caller":"grpc@v1.71.1/clientconn.go:563","msg":"[core] [Channel #1]Channel Connectivity change to READY"} {"level":"info","ts":1751881204.0425653,"caller":"transport/http2_client.go:1624","msg":"[transport] [client-transport 0xc0004bc008] Closing: connection error: desc = \"error reading from server: EOF\""} {"level":"info","ts":1751881204.0425954,"caller":"grpc@v1.71.1/clientconn.go:1224","msg":"[core] [Channel #1 SubChannel #10]Subchannel Connectivity change to IDLE"} {"level":"info","ts":1751881204.042613,"caller":"transport/controlbuf.go:580","msg":"[transport] [client-transport 0xc0004bc008] loopyWriter exiting with error: connection error: desc = \"error reading from server: EOF\""} {"level":"error","ts":1751881204.0427136,"caller":"app/http_handler.go:517","msg":"HTTP handler, Internal Server Error","error":"grpc stream error: rpc error: code = Unavailable desc = error reading from server: 
EOF","stacktrace":"github.com/jaegertracing/jaeger/cmd/query/app.(*APIHandler).handleError\n\tgithub.com/jaegertracing/jaeger/cmd/query/app/http_handler.go:517\ngithub.com/jaegertracing/jaeger/cmd/query/app.(*APIHandler).getTrace\n\tgithub.com/jaegertracing/jaeger/cmd/query/app/http_handler.go:475\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.WithRouteTag.func1\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp@v0.60.0/handler.go:231\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngithub.com/jaegertracing/jaeger/cmd/query/app.(*APIHandler).handleFunc.spanNameHandler.func1\n\tgithub.com/jaegertracing/jaeger/cmd/query/app/http_handler.go:554\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngithub.com/gorilla/mux.(*Router).ServeHTTP\n\tgithub.com/gorilla/mux@v1.8.1/mux.go:212\ngithub.com/jaegertracing/jaeger/cmd/query/app.initRouter.PropagationHandler.func4\n\tgithub.com/jaegertracing/jaeger/internal/bearertoken/http.go:41\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngithub.com/jaegertracing/jaeger/cmd/query/app.initRouter.ExtractTenantHTTPHandler.func5\n\tgithub.com/jaegertracing/jaeger/internal/tenancy/http.go:36\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngithub.com/jaegertracing/jaeger/cmd/query/app.initRouter.traceResponseHandler.func6\n\tgithub.com/jaegertracing/jaeger/cmd/query/app/trace_response_handler.go:23\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngithub.com/gorilla/handlers.recoveryHandler.ServeHTTP\n\tgithub.com/gorilla/handlers@v1.5.2/recovery.go:80\ngo.opentelemetry.io/collector/config/confighttp.(*decompressor).ServeHTTP\n\tgo.opentelemetry.io/collector/config/confighttp@v0.123.0/compression.go:183\ngo.opentelemetry.io/collector/config/confighttp.(*ServerConfig).ToServer.maxRequestBodySizeInterceptor.func2\n\tgo.opentelemetry.io/collector/config/confighttp@v0.123.0/confighttp.go:562\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngo.opentelemetry.io/collector/config/confighttp.(*ServerConfig).ToServer.responseHeadersHandler.func5\n\tgo.opentelemetry.io/collector/config/confighttp@v0.123.0/confighttp.go:509\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*middleware).serveHTTP\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp@v0.60.0/handler.go:179\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.NewMiddleware.func1.1\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp@v0.60.0/handler.go:67\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2294\ngo.opentelemetry.io/collector/config/confighttp.(*clientInfoHandler).ServeHTTP\n\tgo.opentelemetry.io/collector/config/confighttp@v0.123.0/clientinfohandler.go:26\nnet/http.serverHandler.ServeHTTP\n\tnet/http/server.go:3301\nnet/http.(*conn).serve\n\tnet/http/server.go:2102"} {"level":"info","ts":1751881204.0429068,"caller":"pickfirst/pickfirst.go:184","msg":"[pick-first-lb] [pick-first-lb 0xc00067e870] Received SubConn state update: 0xc00053e410, {ConnectivityState:IDLE ConnectionError:<nil> connectedAddress:{Addr: ServerName: Attributes:<nil> BalancerAttributes:<nil> Metadata:<nil>}}"} {"level":"info","ts":1751881204.042926,"caller":"grpc@v1.71.1/clientconn.go:563","msg":"[core] [Channel #1]Channel Connectivity change to IDLE"}
Steps to reproduce the issue:
- Install the Tempo Operator built from the latest upstream branch.
- Install the OpenTelemetry Operator.
- Run the multitenancy and monolithic-multitenancy-openshift tests with --skip-delete.
- Open the Tempo Query frontend route, select any trace, and click Trace Timeline > Trace JSON to view the Trace JSON output (an equivalent direct API call is sketched after this list).
- The request fails with the error shown in the description, and the Tempo Query frontend pod restarts.
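A minimal sketch of an equivalent reproduction outside the UI, assuming the Trace JSON view is backed by the Jaeger query HTTP API endpoint /api/traces/{traceID} (consistent with the getTrace handler in the stack trace above). The route host and trace ID below are placeholders, and the OpenShift route may additionally require authentication and TLS configuration:

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Placeholders: substitute the Tempo Query frontend route and a trace ID
	// returned by the trace search.
	url := "https://<tempo-query-frontend-route>/api/traces/<trace-id>"

	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)

	// On the affected build this is expected to return HTTP 500 with the
	// "grpc stream error: ... Unavailable ... EOF" payload quoted in the
	// description, after which the Tempo Query frontend pod restarts.
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}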
Additional notes:
This issue is observed upstream only and does not occur in the RHOSDT 3.6 production release.
is duplicated by:
TRACING-5688 Internal Server Error in Jaeger UI Single trace URL (New)
links to: