OpenShift Bugs · OCPBUGS-77457

Gateway Pod can not pull wasm plugin behind an enterprise proxy


      Description of problem:

      We have installed RHOAI 3.2 and Connectivity Link, along with their dependencies, in a customer on-premises environment behind an enterprise HTTP proxy. Following the RHOAI documentation, we defined the OpenShift AI Inference Gateway in the openshift-ingress project. However, the Inference Gateway returns an RBAC error (HTTP 403) for all requests. We have identified the root cause as the gateway's inability to pull the required Wasm plugin due to the proxy restrictions. This is a customer PoC environment; the cluster-wide egress proxy is already configured, and other pods have no problem pulling images.
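For context, the cluster-wide egress proxy configuration mentioned above can be inspected with the standard OpenShift CLI; the output values are environment-specific:

```shell
# Inspect the cluster-wide proxy object (proxies.config.openshift.io/cluster).
oc get proxy/cluster -o jsonpath='{.spec.httpProxy}{"\n"}{.spec.httpsProxy}{"\n"}{.spec.noProxy}{"\n"}'
```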

      Version-Release number of selected component (if applicable):

          

      How reproducible:

          Always

      Steps to Reproduce:

          1. Install OpenShift AI 4.2, Connectivity Link, and MetalLB (for an external IP)
          2. Configure the OpenShift AI Inference Gateway
          3. Configure llm-d
          4. Deploy an LLMInferenceService
          5. After the LLMInferenceService is ready, send a request
          6. Observe an HTTP 403 RBAC error in the response
          7. Check the Gateway logs to see that it cannot pull the Wasm plugin
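The request in steps 5 and 6 can be reproduced with a plain curl call against the gateway; the host and path below are taken from the access log in this report, so adjust them to the actual deployment:

```shell
# Host and path copied from the access log in this report; adjust as needed.
GATEWAY_HOST=openshift-ai-inference-openshift-ai-inference.openshift-ingress.svc.cluster.local
curl -i "http://${GATEWAY_HOST}/llm-test/gemma-3-12b-llmd"
# An affected gateway answers HTTP 403 with rbac_access_denied_matched_policy[none].
```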
      
          

      Actual results:

          The Gateway logs the following access-log entry and Wasm fetch error:

      [2026-02-23T13:13:58.925Z] "GET /llm-test/gemma-3-12b-llmd HTTP/1.1" 403 - rbac_access_denied_matched_policy[none] - "-" 0 19 0 - "10.223.3.115" "curl/7.76.1" "82d4eef2-9125-4529-be96-0509ba5bb41e" "openshift-ai-inference-openshift-ai-inference.openshift-ingress.svc.cluster.local" "-" outbound|8000||gemma-3-12b-llmd-kserve-workload-svc.llm-test.svc.cluster.local - 10.220.4.208:80 10.223.3.115:38586 - llm-test.gemma-3-12b-llmd-kserve-route.2

      2026-02-23T13:14:21.586710Z error wasm error in converting the wasm config to local: cannot fetch Wasm module oci://registry.access.redhat.com/rhcl-1/wasm-shim-rhel9@sha256:aa524e9278976aa3ef0f2d3aeb981e28d2ea6ed6fc5fede05a5dd33db0bb3de2: could not fetch Wasm OCI image: could not fetch manifest: Get "https://registry.access.redhat.com/v2/": context deadline exceeded. applying deny RBAC filter
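One way to confirm the root cause is to try the same registry endpoint from inside the gateway pod; without proxy variables the request should time out just like the Wasm fetch. The pod name below is a placeholder, not a value from this report:

```shell
# <gateway-pod> is a placeholder; find the actual pod with:
#   oc get pods -n openshift-ingress
oc exec -n openshift-ingress <gateway-pod> -- \
  curl -sS --max-time 10 https://registry.access.redhat.com/v2/
# Without proxy env vars this times out, matching "context deadline exceeded" above.
```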

       

      Expected results:

      The Gateway must be able to download the wasm-shim image (used for authentication and authorization) through the proxy, even when those features are disabled for a specific LLMInferenceService. OpenShift platform components are expected to respect the cluster-wide egress proxy configuration (proxies.config.openshift.io/cluster). The Ingress Operator should configure the Istio control plane accordingly, so that we do not need to set the proxy environment variables ourselves.
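Concretely, the expectation is that the gateway deployment ends up with the standard proxy environment variables mirroring the cluster proxy object. A sketch with illustrative values only (the proxy URL and NO_PROXY entries are placeholders, not values from this environment):

```yaml
# Illustrative values; real ones come from proxies.config.openshift.io/cluster.
env:
  - name: HTTP_PROXY
    value: http://proxy.example.com:3128
  - name: HTTPS_PROXY
    value: http://proxy.example.com:3128
  - name: NO_PROXY
    value: .cluster.local,.svc,localhost,127.0.0.1
```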

      Additional info:

      As a workaround, I manually injected the proxy variables into the Gateway and confirmed that this resolves the issue.
      I injected the variables as follows:
      First, I created a ConfigMap: https://github.com/erkerc/rhoai-distributed-inference/blob/main/llm-d/kubernetes/gateway-proxy-cm.yaml
      Then I configured the Gateway to inject those variables: https://github.com/erkerc/rhoai-distributed-inference/blob/main/llm-d/kubernetes/gateway-proxy.yaml
      
      apiVersion: gateway.networking.k8s.io/v1
      kind: Gateway
      metadata:
        name: openshift-ai-inference
        namespace: openshift-ingress
      spec:
        gatewayClassName: openshift-default
        infrastructure:
          parametersRef:
            group: ""
            kind: ConfigMap
            name: gateway-proxy-env
        listeners:
          - name: http
            port: 80
            protocol: HTTP
            allowedRoutes:
              namespaces:
                from: All
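For readers without access to the linked repository, a minimal sketch of what such a gateway-proxy-env ConfigMap could look like. The authoritative keys and values are in the linked gateway-proxy-cm.yaml; the proxy URL here is a placeholder:

```yaml
# Assumed shape; the authoritative version is the linked gateway-proxy-cm.yaml.
apiVersion: v1
kind: ConfigMap
metadata:
  name: gateway-proxy-env
  namespace: openshift-ingress
data:
  HTTP_PROXY: http://proxy.example.com:3128
  HTTPS_PROXY: http://proxy.example.com:3128
  NO_PROXY: .cluster.local,.svc,localhost,127.0.0.1
```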

       

       

              nid-team-bot NID Team Bot
              eercan@redhat.com Erkan Ercan
              Hongan Li Hongan Li