exec: [go test -json -tags=integ -timeout 180m ./tests/integration/telemetry/policy -args -istio.test.skipWorkloads=tproxy,vm -istio.test.openshift -istio.test.kube.helm.values=global.platform=openshift -istio.test.istio.enableCNI=true -istio.test.ci=true -istio.test.env=kube -istio.test.kube.deploy=false -istio.test.stableNamespaces=true -istio.test.kube.deployGatewayAPI=false -istio.test.gatewayConformance.maxTimeToConsistency=180s -istio.test.work_dir=/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts]
go test pid: 2306540
2025-08-01T20:16:31.986452Z warn unable to resolve TARGET_OUT. Dir /root/mohit/istio/out/linux_ppc64le does not exist
2025-08-01T20:16:31.986527Z warn unable to resolve LOCAL_OUT. Dir /root/mohit/istio/out/linux_ppc64le does not exist
2025-08-01T20:16:32.017823Z info tf === Test Framework Settings ===
2025-08-01T20:16:32.017930Z info tf TestID: telemetry_policy
RunID: 6ed17a14-0a71-435f-b33e-867862a7ec5d
NoCleanup: false
BaseDir: /home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts
Selector:
FailOnDeprecation: false
CIMode: true
Retries: 0
StableNamespaces: true
Revision:
SkipWorkloads [tproxy vm]
Compatibility: false
Revisions:
Hub: quay.io/maistra
Tag: ibm-p
Variant:
PullPolicy: Always
PullSecret:
MaxDumps: 10
HelmRepo: https://istio-release.storage.googleapis.com/charts
IPFamilies: []
GatewayConformanceStandardOnly: false
GatewayConformanceAllowCRDsMismatch: false
2025-08-01T20:16:32.017937Z info tf ===============================
2025-08-01T20:16:32.019119Z info tf Test run dir: /home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143
2025-08-01T20:16:32.019207Z info tf Test Framework Kubernetes environment Settings:
Kubeconfigs: []
LoadBalancerSupported: true
MCSControllerEnabled: false
ControlPlaneTopology: map[]
NetworkTopology: map[]
ConfigTopology: map[]
2025-08-01T20:16:32.019220Z info tf Flags istio.test.kube.config and istio.test.kube.topology not specified.
2025-08-01T20:16:32.019226Z info tf Environment variable KUBECONFIG unspecified, defaulting to ~/.kube/config.
2025-08-01T20:16:32.019242Z info tf Using KubeConfigs: [/root/.kube/config].
2025-08-01T20:16:32.019250Z info tf === BEGIN: Building clusters ===
2025-08-01T20:16:32.020759Z info tf Built Cluster: Name: cluster-0 StableName: primary-0 PrimaryCluster: cluster-0 ConfigCluster: cluster-0 Network: HTTPProxy: ProxyKubectlOnly: false Filename: /root/.kube/config
2025-08-01T20:16:32.020768Z info tf === DONE: Building clusters ===
2025-08-01T20:16:32.020797Z info tf === BEGIN: Setup: 'telemetry_policy' ===
2025-08-01T20:16:32.020825Z info tf === BEGIN: Deploy Istio [Suite=telemetry_policy] ===
2025-08-01T20:16:32.020833Z info tf === Istio Component Config ===
2025-08-01T20:16:32.020875Z info tf SystemNamespace: istio-system
TelemetryNamespace: istio-system
DeployIstio: false
DeployEastWestGW: true
Values: map[global.hub:quay.io/maistra global.imagePullPolicy:Always global.platform:openshift global.tag:ibm-p global.variant:]
PrimaryClusterIOPFile: tests/integration/iop-integration-test-defaults.yaml
ConfigClusterIOPFile: tests/integration/iop-integration-test-defaults.yaml
RemoteClusterIOPFile: tests/integration/iop-remote-integration-test-defaults.yaml
BaseIOPFile: tests/integration/base.yaml
SkipWaitForValidationWebhook: false
DumpKubernetesManifests: false
IstiodlessRemotes: true
OperatorOptions: map[]
EnableCNI: true
IngressGatewayServiceName:
IngressGatewayServiceNamespace:
IngressGatewayIstioLabel:
EgressGatewayServiceName: istio-egressgateway
EgressGatewayServiceNamespace: istio-system
EgressGatewayIstioLabel: egressgateway
SharedMeshConfigName:
ControlPlaneInstaller:
2025-08-01T20:16:32.020880Z info tf ================================
2025-08-01T20:16:32.025553Z info tf skipping deployment as specified in the config
2025-08-01T20:16:32.025652Z info tf === SUCCEEDED: Deploy Istio in 4.822113ms [Suite=telemetry_policy]===
2025-08-01T20:16:32.025660Z info tf === BEGIN: Create namespace istio-echo ===
2025-08-01T20:16:32.054057Z info tf === SUCCEEDED: Create namespace istio-echo in 28.373248ms ===
2025-08-01T20:16:32.195775Z info tf === BEGIN: Deploy echo instances ===
2025-08-01T20:16:32.679564Z info klog Waiting for caches to sync controller=echo
2025-08-01T20:16:32.686921Z info klog Waiting for caches to sync controller=echo
2025-08-01T20:16:32.780373Z info klog Caches are synced controller=echo
2025-08-01T20:16:32.787521Z info klog Caches are synced controller=echo
2025-08-01T20:16:39.091098Z info tf === SUCCEEDED: Deploy echo instances in 6.895318414s ===
2025-08-01T20:16:39.091217Z info tf === BEGIN: Create namespace istio-ratelimit ===
2025-08-01T20:16:39.128811Z info tf === SUCCEEDED: Create namespace istio-ratelimit in 37.588187ms ===
2025-08-01T20:16:39.536793Z info tf Checking pods ready...
2025-08-01T20:16:39.536835Z info tf Checking pods ready...
2025-08-01T20:16:39.548354Z info tf [ 0] redis-65fbbd8f94-glwhw Pending (Pending)
2025-08-01T20:16:39.749042Z info tf Checking pods ready...
2025-08-01T20:16:39.749087Z info tf Checking pods ready...
2025-08-01T20:16:39.761513Z info tf [ 0] redis-65fbbd8f94-glwhw Pending (Pending)
2025-08-01T20:16:40.162684Z info tf Checking pods ready...
2025-08-01T20:16:40.162725Z info tf Checking pods ready...
2025-08-01T20:16:40.169471Z info tf [ 0] redis-65fbbd8f94-glwhw Pending (Pending)
2025-08-01T20:16:40.970113Z info tf Checking pods ready...
2025-08-01T20:16:40.970169Z info tf Checking pods ready...
2025-08-01T20:16:40.974785Z info tf [ 0] redis-65fbbd8f94-glwhw Pending (Pending)
2025-08-01T20:16:42.575716Z info tf Checking pods ready...
2025-08-01T20:16:42.575748Z info tf Checking pods ready...
2025-08-01T20:16:42.580689Z info tf [ 0] redis-65fbbd8f94-glwhw Pending (Pending)
2025-08-01T20:16:45.781496Z info tf Checking pods ready...
2025-08-01T20:16:45.781539Z info tf Checking pods ready...
2025-08-01T20:16:45.786307Z info tf [ 0] redis-65fbbd8f94-glwhw Running (Ready)
2025-08-01T20:16:45.786369Z info tf Checking pods ready...
2025-08-01T20:16:45.786375Z info tf Checking pods ready...
2025-08-01T20:16:45.828312Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:46.028997Z info tf Checking pods ready...
2025-08-01T20:16:46.029043Z info tf Checking pods ready...
2025-08-01T20:16:46.035215Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:46.436248Z info tf Checking pods ready...
2025-08-01T20:16:46.436337Z info tf Checking pods ready...
2025-08-01T20:16:46.463266Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:47.263812Z info tf Checking pods ready...
2025-08-01T20:16:47.263862Z info tf Checking pods ready...
2025-08-01T20:16:47.282033Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:48.883096Z info tf Checking pods ready...
2025-08-01T20:16:48.883185Z info tf Checking pods ready...
2025-08-01T20:16:48.888104Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:52.088537Z info tf Checking pods ready...
2025-08-01T20:16:52.088578Z info tf Checking pods ready...
2025-08-01T20:16:52.097092Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:55.297706Z info tf Checking pods ready...
2025-08-01T20:16:55.297755Z info tf Checking pods ready...
2025-08-01T20:16:55.301407Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:16:58.501506Z info tf Checking pods ready...
2025-08-01T20:16:58.501550Z info tf Checking pods ready...
2025-08-01T20:16:58.506851Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (container not ready: 'ratelimit')
2025-08-01T20:17:01.707050Z info tf Checking pods ready...
2025-08-01T20:17:01.707086Z info tf Checking pods ready...
2025-08-01T20:17:01.711641Z info tf [ 0] ratelimit-586944fcfb-dcr9w Running (Ready)
2025-08-01T20:17:01.780024Z info tf Checking pods ready...
2025-08-01T20:17:01.780082Z info tf Checking pods ready...
2025-08-01T20:17:01.783964Z info tf [ 0] prometheus-858678758b-8kgsp Running (Ready)
2025-08-01T20:17:01.829409Z info tf === DONE: Setup: 'telemetry_policy' (29.80859559s) ===
2025-08-01T20:17:01.829446Z info tf === BEGIN: Test Run: 'telemetry_policy' ===
PASS tests/integration/telemetry/policy.TestRateLimiting (1.25s)
PASS tests/integration/telemetry/policy.TestLocalRateLimiting (0.13s)
PASS tests/integration/telemetry/policy.TestLocalRouteSpecificRateLimiting (1.48s)
PASS tests/integration/telemetry/policy.TestLocalRateLimitingServiceAccount (0.37s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTP_Traffic (5.08s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTP_H2_Traffic (0.03s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTPS_Traffic (5.07s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTPS_Traffic_Conflict (0.05s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTPS_H2_Traffic (0.04s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTPS_H2_Traffic_Conflict (0.04s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTP_Traffic_Egress (4.09s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/HTTP_H2_Traffic_Egress (0.06s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/TCP (0.03s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny/TCP_Conflict (0.03s)
PASS tests/integration/telemetry/policy.TestOutboundTrafficPolicy_AllowAny (25.50s)
=== RUN TestOutboundTrafficPolicy_RegistryOnly
2025-08-01T20:17:30.548073Z info tf === BEGIN: Test: 'telemetry_policy[TestOutboundTrafficPolicy_RegistryOnly]' ===
2025-08-01T20:17:30.548118Z info tf === BEGIN: Create namespace app ===
2025-08-01T20:17:30.602433Z info tf === SUCCEEDED: Create namespace app in 54.30575ms ===
2025-08-01T20:17:30.602493Z info tf === BEGIN: Create namespace service ===
2025-08-01T20:17:30.613151Z info tf === SUCCEEDED: Create namespace service in 10.646923ms ===
helper_test.go:184: failed to apply service entries: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143/TestOutboundTrafficPolicy_RegistryOnly/_test_context/Sidecar.1922670889.yaml] to ns app in cluster cluster-0: apply: patch: sidecars.networking.istio.io "restrict-to-service-entry-namespace" is forbidden: unable to create new content in namespace app because it is being terminated
2025-08-01T20:17:30.664470Z info tf === BEGIN: Deploy echo instances ===
2025-08-01T20:17:30.741469Z info klog Waiting for caches to sync controller=echo
2025-08-01T20:17:30.753109Z info klog Waiting for caches to sync controller=echo
2025-08-01T20:17:30.841893Z info klog Caches are synced controller=echo
2025-08-01T20:17:30.854277Z info klog Caches are synced controller=echo
2025-08-01T20:17:30.877557Z info tf === SUCCEEDED: Deploy echo instances in 213.087819ms ===
helper_test.go:254: failed to apply service entries: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143/TestOutboundTrafficPolicy_RegistryOnly/_test_context/ServiceEntry.3293324767.yaml] to ns service in cluster cluster-0: apply: patch: serviceentries.networking.istio.io "http" is forbidden: unable to create new content in namespace service because it is being terminated
helper_test.go:254: failed to apply gateway: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143/TestOutboundTrafficPolicy_RegistryOnly/_test_context/VirtualService-ServiceEntry-Gateway.4264448779.yaml] to ns service in cluster cluster-0: apply: patch: gateways.networking.istio.io "istio-egressgateway" is forbidden: unable to create new content in namespace service because it is being terminated.
template:
apiVersion: networking.istio.io/v1
kind: Gateway
metadata:
  name: istio-egressgateway
spec:
  selector:
    istio: egressgateway
  servers:
  - port:
      number: 80
      name: http
      protocol: HTTP
    hosts:
    - "some-external-site.com"
---
apiVersion: networking.istio.io/v1
kind: VirtualService
metadata:
  name: route-via-egressgateway
spec:
  hosts:
  - "some-external-site.com"
  gateways:
  - istio-egressgateway
  - mesh
  http:
  - match:
    - gateways:
      - mesh # from sidecars, route to egress gateway service
      port: 80
    route:
    - destination:
        host: istio-egressgateway.istio-system.svc.cluster.local
        port:
          number: 80
      weight: 100
  - match:
    - gateways:
      - istio-egressgateway
      port: 80
    route:
    - destination:
        host: some-external-site.com
      headers:
        request:
          add:
            handled-by-egress-gateway: "true"
---
apiVersion: networking.istio.io/v1
kind: ServiceEntry
metadata:
  name: ext-service-entry
spec:
  hosts:
  - "some-external-site.com"
  location: MESH_EXTERNAL
  endpoints:
  - address: destination.app.svc.cluster.local
    network: external
  ports:
  - number: 80
    name: http
  resolution: DNS
2025-08-01T20:17:30.891644Z info tf === DONE (failed): Test: 'telemetry_policy[TestOutboundTrafficPolicy_RegistryOnly] (343.569429ms)' ===
2025-08-01T20:17:30.891699Z error tf === Dumping Namespace istio-ratelimit State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:30.891709Z error tf === Dumping Istio Deployment State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:30.891719Z error tf === Dumping Namespace istio-echo State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:31.161589Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/19816b219f-38d7-407b-a064-9516a96face4
2025-08-01T20:17:31.162033Z info tf istioctl ([x internal-debug --all configz]): completed after 0.2701s
2025-08-01T20:17:31.162284Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:31.271437Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/19d958fde6-6662-43d8-8590-9855c46b0d91
2025-08-01T20:17:31.271554Z info tf istioctl ([x internal-debug --all mcsz]): completed after 0.3796s
2025-08-01T20:17:31.272110Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:31.378556Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/192380bce5-2bcd-497c-9c3a-cf0ed0629d05
2025-08-01T20:17:31.378665Z info tf istioctl ([x internal-debug --all clusterz]): completed after 0.4866s
2025-08-01T20:17:31.379233Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:32.059499Z error tf === Dumping Namespace service State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:32.059509Z error tf === Dumping Namespace app State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:32.615316Z warn tf cluster/pod cluster-0/app/client-v1-9cfd95bd4-hfp2k found warming resources (dynamic_warming_clusters) on attempt 1
--- FAIL: TestOutboundTrafficPolicy_RegistryOnly (4.10s)
FAIL tests/integration/telemetry/policy.TestOutboundTrafficPolicy_RegistryOnly (4.10s)
2025-08-01T20:17:34.650364Z info tf === FAILED: Test Run: 'telemetry_policy' (exitCode: 1) ===
2025-08-01T20:17:34.650420Z info tf === Suite "telemetry_policy" run time: 1m2.629637098s ===
2025-08-01T20:17:34.650434Z info tf Wrote trace to /home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/trace.yaml
2025-08-01T20:17:34.651289Z error tf === Dumping Namespace istio-ratelimit State for [suite(telemetry_policy)]...
2025-08-01T20:17:34.651299Z error tf === Dumping Istio Deployment State for [suite(telemetry_policy)]...
2025-08-01T20:17:34.651319Z error tf === Dumping Namespace istio-echo State for [suite(telemetry_policy)]...
2025-08-01T20:17:34.897738Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/19fa2ceb4b-7b04-4b2a-aa27-f04c10e5cab5
2025-08-01T20:17:34.898104Z info tf istioctl ([x internal-debug --all configz]): completed after 0.2466s
2025-08-01T20:17:34.898530Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:34.994732Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/194e208fb8-cb08-489e-9ce1-b94d5ca7efc0
2025-08-01T20:17:34.994823Z info tf istioctl ([x internal-debug --all mcsz]): completed after 0.3433s
2025-08-01T20:17:34.995352Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:35.098566Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/192571fc57-514e-419a-bd92-903984c758d9
2025-08-01T20:17:35.098718Z info tf istioctl ([x internal-debug --all clusterz]): completed after 0.4472s
2025-08-01T20:17:35.099284Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:36.304687Z info tf === BEGIN: Cleanup Istio [Suite=telemetry_policy] ===
2025-08-01T20:17:36.304786Z info tf === SUCCEEDED: Cleanup Istio in 84.602µs [Suite=telemetry_policy] ===
FAIL tests/integration/telemetry/policy
DONE 16 tests, 1 failure in 77.843s
exec: [go test -json -test.run=^TestOutboundTrafficPolicy_RegistryOnly$ -tags=integ -timeout 180m istio.io/istio/tests/integration/telemetry/policy -args -istio.test.skipWorkloads=tproxy,vm -istio.test.openshift -istio.test.kube.helm.values=global.platform=openshift -istio.test.istio.enableCNI=true -istio.test.ci=true -istio.test.env=kube -istio.test.kube.deploy=false -istio.test.stableNamespaces=true -istio.test.kube.deployGatewayAPI=false -istio.test.gatewayConformance.maxTimeToConsistency=180s -istio.test.work_dir=/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts]
go test pid: 2308302
2025-08-01T20:17:47.904799Z warn unable to resolve TARGET_OUT. Dir /root/mohit/istio/out/linux_ppc64le does not exist
2025-08-01T20:17:47.904842Z warn unable to resolve LOCAL_OUT. Dir /root/mohit/istio/out/linux_ppc64le does not exist
2025-08-01T20:17:47.936745Z info tf === Test Framework Settings ===
2025-08-01T20:17:47.936810Z info tf TestID: telemetry_policy
RunID: d3a5a2f2-f28e-44fc-8f58-cd045d5b9c9b
NoCleanup: false
BaseDir: /home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts
Selector:
FailOnDeprecation: false
CIMode: true
Retries: 0
StableNamespaces: true
Revision:
SkipWorkloads [tproxy vm]
Compatibility: false
Revisions:
Hub: quay.io/maistra
Tag: ibm-p
Variant:
PullPolicy: Always
PullSecret:
MaxDumps: 10
HelmRepo: https://istio-release.storage.googleapis.com/charts
IPFamilies: []
GatewayConformanceStandardOnly: false
GatewayConformanceAllowCRDsMismatch: false
2025-08-01T20:17:47.936815Z info tf ===============================
2025-08-01T20:17:47.936978Z info tf Test run dir: /home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-d3a5a2f2f28e44
2025-08-01T20:17:47.937063Z info tf Test Framework Kubernetes environment Settings:
Kubeconfigs: []
LoadBalancerSupported: true
MCSControllerEnabled: false
ControlPlaneTopology: map[]
NetworkTopology: map[]
ConfigTopology: map[]
2025-08-01T20:17:47.937072Z info tf Flags istio.test.kube.config and istio.test.kube.topology not specified.
2025-08-01T20:17:47.937077Z info tf Environment variable KUBECONFIG unspecified, defaulting to ~/.kube/config.
2025-08-01T20:17:47.937092Z info tf Using KubeConfigs: [/root/.kube/config].
2025-08-01T20:17:47.937099Z info tf === BEGIN: Building clusters ===
2025-08-01T20:17:47.938325Z info tf Built Cluster: Name: cluster-0 StableName: primary-0 PrimaryCluster: cluster-0 ConfigCluster: cluster-0 Network: HTTPProxy: ProxyKubectlOnly: false Filename: /root/.kube/config
2025-08-01T20:17:47.938331Z info tf === DONE: Building clusters ===
2025-08-01T20:17:47.938343Z info tf === BEGIN: Setup: 'telemetry_policy' ===
2025-08-01T20:17:47.938362Z info tf === BEGIN: Deploy Istio [Suite=telemetry_policy] ===
2025-08-01T20:17:47.938367Z info tf === Istio Component Config ===
2025-08-01T20:17:47.938396Z info tf SystemNamespace: istio-system
TelemetryNamespace: istio-system
DeployIstio: false
DeployEastWestGW: true
Values: map[global.hub:quay.io/maistra global.imagePullPolicy:Always global.platform:openshift global.tag:ibm-p global.variant:]
PrimaryClusterIOPFile: tests/integration/iop-integration-test-defaults.yaml
ConfigClusterIOPFile: tests/integration/iop-integration-test-defaults.yaml
RemoteClusterIOPFile: tests/integration/iop-remote-integration-test-defaults.yaml
BaseIOPFile: tests/integration/base.yaml
SkipWaitForValidationWebhook: false
DumpKubernetesManifests: false
IstiodlessRemotes: true
OperatorOptions: map[]
EnableCNI: true
IngressGatewayServiceName:
IngressGatewayServiceNamespace:
IngressGatewayIstioLabel:
EgressGatewayServiceName: istio-egressgateway
EgressGatewayServiceNamespace: istio-system
EgressGatewayIstioLabel: egressgateway
SharedMeshConfigName:
ControlPlaneInstaller:
2025-08-01T20:17:47.938401Z info tf ================================
2025-08-01T20:17:47.940597Z info tf skipping deployment as specified in the config
2025-08-01T20:17:47.940609Z info tf === SUCCEEDED: Deploy Istio in 2.243995ms [Suite=telemetry_policy]===
2025-08-01T20:17:47.940615Z info tf === BEGIN: Create namespace istio-echo ===
2025-08-01T20:17:47.999281Z info tf === SUCCEEDED: Create namespace istio-echo in 58.647759ms ===
2025-08-01T20:17:48.100571Z info tf === BEGIN: Deploy echo instances ===
2025-08-01T20:17:48.123019Z error tf === FAILED: Deploy echo instances ===
2025-08-01T20:17:48.123072Z error tf failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-d3a5a2f2f28e44/_suite_context/ServiceAccount-Service.1814863651.yaml] to ns istio-echo in cluster cluster-0: apply: patch: serviceaccounts "clt" is forbidden: unable to create new content in namespace istio-echo because it is being terminated
2025-08-01T20:17:48.123107Z error tf === Dumping Namespace istio-echo State for [suite(telemetry_policy)]...
2025-08-01T20:17:48.123116Z error tf === Dumping Istio Deployment State for [suite(telemetry_policy)]...
2025-08-01T20:17:48.456630Z error tf dump failed: hit max attempts 5 attempts (last error: failure running port forward process: building port forwarded: failed retrieving: pods "srv-v1-6fdf4ff5d4-xpwzd" not found in the "istio-echo" namespace)
2025-08-01T20:17:48.540850Z error tf failed to dump ndsz: hit max attempts 5 attempts (last error: failure running port forward process: building port forwarded: failed retrieving: pods "srv-v1-6fdf4ff5d4-xpwzd" not found in the "istio-echo" namespace)
2025-08-01T20:17:48.590440Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:45Z/22d8355cbb-104d-42b4-9651-ae21191b2fa9
2025-08-01T20:17:48.590738Z info tf istioctl ([x internal-debug --all configz]): completed after 0.4674s
2025-08-01T20:17:48.591079Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:48.605015Z error tf Test setup error: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-d3a5a2f2f28e44/_suite_context/ServiceAccount-Service.1814863651.yaml] to ns istio-echo in cluster cluster-0: apply: patch: serviceaccounts "clt" is forbidden: unable to create new content in namespace istio-echo because it is being terminated
2025-08-01T20:17:48.605101Z info tf === FAILED: Setup: 'telemetry_policy' (failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-d3a5a2f2f28e44/_suite_context/ServiceAccount-Service.1814863651.yaml] to ns istio-echo in cluster cluster-0: apply: patch: serviceaccounts "clt" is forbidden: unable to create new content in namespace istio-echo because it is being terminated) ===
2025-08-01T20:17:48.605108Z error tf Exiting due to setup failure: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-d3a5a2f2f28e44/_suite_context/ServiceAccount-Service.1814863651.yaml] to ns istio-echo in cluster cluster-0: apply: patch: serviceaccounts "clt" is forbidden: unable to create new content in namespace istio-echo because it is being terminated
2025-08-01T20:17:48.605134Z error tf === Dumping Namespace istio-echo State for [suite(telemetry_policy)]...
2025-08-01T20:17:48.605158Z error tf === Dumping Istio Deployment State for [suite(telemetry_policy)]...
2025-08-01T20:17:48.696817Z info tf === BEGIN: Cleanup Istio [Suite=telemetry_policy] ===
2025-08-01T20:17:48.696890Z info tf === SUCCEEDED: Cleanup Istio in 69.107µs [Suite=telemetry_policy] ===
FAIL tests/integration/telemetry/policy

=== Failed
=== FAIL: tests/integration/telemetry/policy TestOutboundTrafficPolicy_RegistryOnly (4.10s)
2025-08-01T20:17:30.548073Z info tf === BEGIN: Test: 'telemetry_policy[TestOutboundTrafficPolicy_RegistryOnly]' ===
2025-08-01T20:17:30.548118Z info tf === BEGIN: Create namespace app ===
2025-08-01T20:17:30.602433Z info tf === SUCCEEDED: Create namespace app in 54.30575ms ===
2025-08-01T20:17:30.602493Z info tf === BEGIN: Create namespace service ===
2025-08-01T20:17:30.613151Z info tf === SUCCEEDED: Create namespace service in 10.646923ms ===
helper_test.go:184: failed to apply service entries: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143/TestOutboundTrafficPolicy_RegistryOnly/_test_context/Sidecar.1922670889.yaml] to ns app in cluster cluster-0: apply: patch: sidecars.networking.istio.io "restrict-to-service-entry-namespace" is forbidden: unable to create new content in namespace app because it is being terminated
2025-08-01T20:17:30.664470Z info tf === BEGIN: Deploy echo instances ===
2025-08-01T20:17:30.741469Z info klog Waiting for caches to sync controller=echo
2025-08-01T20:17:30.753109Z info klog Waiting for caches to sync controller=echo
2025-08-01T20:17:30.841893Z info klog Caches are synced controller=echo
2025-08-01T20:17:30.854277Z info klog Caches are synced controller=echo
2025-08-01T20:17:30.877557Z info tf === SUCCEEDED: Deploy echo instances in 213.087819ms ===
helper_test.go:254: failed to apply service entries: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143/TestOutboundTrafficPolicy_RegistryOnly/_test_context/ServiceEntry.3293324767.yaml] to ns service in cluster cluster-0: apply: patch: serviceentries.networking.istio.io "http" is forbidden: unable to create new content in namespace service because it is being terminated
helper_test.go:254: failed to apply gateway: failed applying YAML files [/home/jenkins/workspace/sail/istio-integration-tests-suites/telemetry-policy/artifacts/telemetry-policy-6ed17a140a7143/TestOutboundTrafficPolicy_RegistryOnly/_test_context/VirtualService-ServiceEntry-Gateway.4264448779.yaml] to ns service in cluster cluster-0: apply: patch: gateways.networking.istio.io "istio-egressgateway" is forbidden: unable to create new content in namespace service because it is being terminated.
template:
apiVersion: networking.istio.io/v1
kind: Gateway
metadata:
  name: istio-egressgateway
spec:
  selector:
    istio: egressgateway
  servers:
  - port:
      number: 80
      name: http
      protocol: HTTP
    hosts:
    - "some-external-site.com"
---
apiVersion: networking.istio.io/v1
kind: VirtualService
metadata:
  name: route-via-egressgateway
spec:
  hosts:
  - "some-external-site.com"
  gateways:
  - istio-egressgateway
  - mesh
  http:
  - match:
    - gateways:
      - mesh # from sidecars, route to egress gateway service
      port: 80
    route:
    - destination:
        host: istio-egressgateway.istio-system.svc.cluster.local
        port:
          number: 80
      weight: 100
  - match:
    - gateways:
      - istio-egressgateway
      port: 80
    route:
    - destination:
        host: some-external-site.com
      headers:
        request:
          add:
            handled-by-egress-gateway: "true"
---
apiVersion: networking.istio.io/v1
kind: ServiceEntry
metadata:
  name: ext-service-entry
spec:
  hosts:
  - "some-external-site.com"
  location: MESH_EXTERNAL
  endpoints:
  - address: destination.app.svc.cluster.local
    network: external
  ports:
  - number: 80
    name: http
  resolution: DNS
2025-08-01T20:17:30.891644Z info tf === DONE (failed): Test: 'telemetry_policy[TestOutboundTrafficPolicy_RegistryOnly] (343.569429ms)' ===
2025-08-01T20:17:30.891699Z error tf === Dumping Namespace istio-ratelimit State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:30.891709Z error tf === Dumping Istio Deployment State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:30.891719Z error tf === Dumping Namespace istio-echo State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:31.161589Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/19816b219f-38d7-407b-a064-9516a96face4
2025-08-01T20:17:31.162033Z info tf istioctl ([x internal-debug --all configz]): completed after 0.2701s
2025-08-01T20:17:31.162284Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:31.271437Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/19d958fde6-6662-43d8-8590-9855c46b0d91
2025-08-01T20:17:31.271554Z info tf istioctl ([x internal-debug --all mcsz]): completed after 0.3796s
2025-08-01T20:17:31.272110Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:31.378556Z info adsc Received type=istio.io/debug count=1 nonce=2025-08-01T20:17:30Z/192380bce5-2bcd-497c-9c3a-cf0ed0629d05
2025-08-01T20:17:31.378665Z info tf istioctl ([x internal-debug --all clusterz]): completed after 0.4866s
2025-08-01T20:17:31.379233Z info adsc connection closed with err: rpc error: code = Unavailable desc = error reading from server: EOF
2025-08-01T20:17:32.059499Z error tf === Dumping Namespace service State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:32.059509Z error tf === Dumping Namespace app State for TestOutboundTrafficPolicy_RegistryOnly...
2025-08-01T20:17:32.615316Z warn tf cluster/pod cluster-0/app/client-v1-9cfd95bd4-hfp2k found warming resources (dynamic_warming_clusters) on attempt 1

DONE 2 runs, 16 tests, 1 failure in 90.243s
exec: go version
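Every failure in this log reports the same underlying condition: with -istio.test.stableNamespaces=true the suite re-applies resources into fixed namespace names (app, service, istio-echo), and the API server rejects each patch with "unable to create new content in namespace ... because it is being terminated", i.e. those namespaces from the previous run had not finished deleting when the retry started. Kubernetes blocks creation of new objects in a namespace whose phase is Terminating, so re-running before the deletion completes reproduces the error. A minimal pre-flight sketch, assuming client-go access to the same cluster via KUBECONFIG; this helper is hypothetical and not part of the test framework:

```go
// Check for namespaces still stuck in the Terminating phase before re-running
// a suite that reuses stable namespace names. Assumes KUBECONFIG (or
// ~/.kube/config) points at the cluster from the log above.
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := os.Getenv("KUBECONFIG")
	if kubeconfig == "" {
		home, _ := os.UserHomeDir()
		kubeconfig = filepath.Join(home, ".kube", "config")
	}
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	nsList, err := client.CoreV1().Namespaces().List(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, ns := range nsList.Items {
		if ns.Status.Phase == corev1.NamespaceTerminating {
			// Finalizers that have not been removed are the usual reason a
			// namespace stays in Terminating and rejects new objects.
			fmt.Printf("%s is Terminating (spec.finalizers=%v)\n", ns.Name, ns.Spec.Finalizers)
		}
	}
}
```

Waiting until no leftover test namespace shows up here (or clearing whichever finalizers are holding one) before re-running the suite should avoid the forbidden-patch errors shown above.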