Type: Bug
Resolution: Done
Sprints: Tracing Sprint # 241, Tracing Sprint # 242
Description of the problem:
Cannot access the Jaeger console when Distributed Tracing is running on a cluster with HTTP_PROXY enabled. Accessing the route returns a 504 Gateway Timeout error.
Version of components:
OCP Server Version: 4.13.0-0.nightly-2023-05-25-001936
jaeger-operator.v1.42.0-5
opentelemetry-operator.v0.74.0-5
How reproducible:
Always
Steps to reproduce the issue:
*Install the Distributed Tracing Platform, Distributed Tracing Collector and Elasticsearch operators.
*Clone the jaeger-scripts repo.
git clone git@gitlab.cee.redhat.com:jkandasa/jaeger-scripts.git
*Go to “jaeger-scripts/scripts/otel/simple_test”
*Run “./01_setup.sh” and wait until Jaeger and the OTEL collector are deployed
*Run “./02_query_app.sh” to report some traces
*Check that the traces are available in Jaeger: xdg-open "https://$(oc get route jaeger --output=jsonpath='{.spec.host}' -n observability)"
https://jaeger-observability.apps.ikanse-35.qe.devcluster.openshift.com
Additional details:
Proxy configuration on the cluster.
$ oc get proxy/cluster -o yaml
apiVersion: config.openshift.io/v1
kind: Proxy
metadata:
  creationTimestamp: "2023-05-25T06:35:56Z"
  generation: 1
  name: cluster
  resourceVersion: "540"
  uid: 3d333f93-19f9-4bc3-9f6d-316d50d2fd7c
spec:
  httpProxy: http://USERNAME:PASSWORD@ec2-18-224-66-176.us-east-2.compute.amazonaws.com:3128
  httpsProxy: http://USERNAME:PASSWORD@ec2-18-224-66-176.us-east-2.compute.amazonaws.com:3128
  noProxy: test.no-proxy.com
  trustedCA:
    name: ""
status:
  httpProxy: http://USERNAME:PASSWORD@ec2-18-224-66-176.us-east-2.compute.amazonaws.com:3128
  httpsProxy: http://USERNAME:PASSWORD@ec2-18-224-66-176.us-east-2.compute.amazonaws.com:3128
  noProxy: .cluster.local,.svc,.us-east-2.compute.internal,10.0.0.0/16,10.128.0.0/14,127.0.0.1,169.254.169.254,172.30.0.0/16,api-int.ikanse-35.qe.devcluster.openshift.com,localhost,test.no-proxy.com
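Go-based clients such as oauth-proxy honor the proxy environment with Go's NO_PROXY semantics: an entry matches on exact host equality, and an entry beginning with a dot matches as a domain suffix. A minimal POSIX-sh sketch under those assumptions (CIDR entries are skipped for brevity; `no_proxy_covers` is a hypothetical helper, not part of any product code) shows that the OAuth route host is not covered by the noProxy list above:

```shell
# Sketch only: approximate Go NO_PROXY matching (exact host match, or
# suffix match for entries that start with a dot). CIDRs are ignored.
no_proxy_covers() {
  host=$1
  _oldifs=$IFS; IFS=','; set -- $2; IFS=$_oldifs
  for entry in "$@"; do
    case $entry in
      .*)
        # ".example.com" covers any host under that domain
        case $host in *"$entry") echo yes; return 0 ;; esac
        ;;
      *)
        if [ "$host" = "$entry" ]; then echo yes; return 0; fi
        ;;
    esac
  done
  echo no
}

# status.noProxy from the cluster Proxy object above
NO_PROXY=".cluster.local,.svc,.us-east-2.compute.internal,10.0.0.0/16,10.128.0.0/14,127.0.0.1,169.254.169.254,172.30.0.0/16,api-int.ikanse-35.qe.devcluster.openshift.com,localhost,test.no-proxy.com"

# The host oauth-proxy must POST to when redeeming the authorization code:
no_proxy_covers "oauth-openshift.apps.ikanse-35.qe.devcluster.openshift.com" "$NO_PROXY"   # prints: no
```

So the token request is sent through the corporate proxy rather than directly to the cluster's ingress, which is consistent with the timeout in the logs below.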
Errors observed in the oauth-proxy container.
$ oc logs jaeger-574f447bdd-b42t5 -c oauth-proxy
2023/05/25 08:29:32 provider.go:129: Defaulting client-id to system:serviceaccount:observability:jaeger-ui-proxy
2023/05/25 08:29:32 provider.go:134: Defaulting client-secret to service account token /var/run/secrets/kubernetes.io/serviceaccount/token
2023/05/25 08:29:32 oauthproxy.go:203: mapping path "/" => upstream "http://localhost:16686/"
2023/05/25 08:29:32 oauthproxy.go:230: OAuthProxy configured for Client ID: system:serviceaccount:observability:jaeger-ui-proxy
2023/05/25 08:29:32 oauthproxy.go:240: Cookie settings: name:_oauth_proxy secure(https):true httponly:true expiry:168h0m0s domain:<default> samesite: refresh:disabled
2023/05/25 08:29:32 http.go:61: HTTP: listening on 127.0.0.1:4180
I0525 08:29:32.732820       1 dynamic_serving_content.go:130] Starting serving::/etc/tls/private/tls.crt::/etc/tls/private/tls.key
2023/05/25 08:29:32 http.go:107: HTTPS: listening on [::]:8443
2023/05/25 08:32:11 provider.go:631: Performing OAuth discovery against https://172.30.0.1/.well-known/oauth-authorization-server
2023/05/25 08:32:11 provider.go:671: 200 GET https://172.30.0.1/.well-known/oauth-authorization-server
{
  "issuer": "https://oauth-openshift.apps.ikanse-35.qe.devcluster.openshift.com",
  "authorization_endpoint": "https://oauth-openshift.apps.ikanse-35.qe.devcluster.openshift.com/oauth/authorize",
  "token_endpoint": "https://oauth-openshift.apps.ikanse-35.qe.devcluster.openshift.com/oauth/token",
  "scopes_supported": ["user:check-access", "user:full", "user:info", "user:list-projects", "user:list-scoped-projects"],
  "response_types_supported": ["code", "token"],
  "grant_types_supported": ["authorization_code", "implicit"],
  "code_challenge_methods_supported": ["plain", "S256"]
}
2023/05/25 08:32:50 provider.go:631: Performing OAuth discovery against https://172.30.0.1/.well-known/oauth-authorization-server
2023/05/25 08:32:50 provider.go:671: 200 GET https://172.30.0.1/.well-known/oauth-authorization-server
(same discovery document as above)
2023/05/25 08:33:50 oauthproxy.go:654: error redeeming code (client:10.131.0.2:56382): Post "https://oauth-openshift.apps.ikanse-35.qe.devcluster.openshift.com/oauth/token": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
2023/05/25 08:33:50 oauthproxy.go:445: ErrorPage 500 Internal Error Internal Error
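The redeem-code timeout above shows that oauth-proxy's POST to the external OAuth route is being sent through the cluster-wide HTTPS proxy, which evidently cannot reach the cluster's ingress. As a purely hypothetical mitigation sketch (not a verified fix for this bug, and the domain below is specific to this test cluster), the ingress apps domain could be exempted from proxying via spec.noProxy on the cluster Proxy object:

```yaml
# Hypothetical sketch: extend spec.noProxy on proxy/cluster so hosts under the
# ingress apps domain (e.g. the oauth-openshift route) bypass the proxy.
spec:
  noProxy: test.no-proxy.com,.apps.ikanse-35.qe.devcluster.openshift.com
```

Applied with `oc patch proxy/cluster --type merge`, this would propagate into status.noProxy; whether pods can then reach the route directly still depends on the cluster's network configuration.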