OpenShift Bugs / OCPBUGS-43744

ipsec pod crashes when enabling ipsec in the hosted cluster


    • Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Affects Version: 4.18
    • Component: HyperShift
      Description of problem:

        When IPsec is enabled in the hosted cluster, the ipsec pod keeps crashing.

      Version-Release number of selected component (if applicable):

      4.18  (in my test, the payload is 4.18.0-0.nightly-2024-10-23-003937)

      How reproducible:

          100%

      Steps to Reproduce:

      1. create a hosted cluster with payload 4.18.0-0.nightly-2024-10-23-003937     
      2. enable IPsec via oc patch:
      
      $ oc --kubeconfig=hosted patch networks.operator.openshift.io cluster --type=merge -p '{"spec":{"defaultNetwork":{"ovnKubernetesConfig":{"ipsecConfig":{"mode":"Full"}}}}}'
      
      3. check ipsec pod
      
      $ oc --kubeconfig=hosted get pod -n openshift-ovn-kubernetes
      NAME                            READY   STATUS             RESTARTS         AGE
      ovn-ipsec-containerized-xsk5d   0/1     CrashLoopBackOff   15 (3m36s ago)   71m
      ovnkube-node-ngmhh              8/8     Running            0                71m
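Rather than eyeballing the listing above, a small helper can flag crash-looping pods. This is a minimal sketch, not part of the bug report; the `crashing_pods` name is hypothetical and it assumes the default `oc get pod` column order (NAME READY STATUS RESTARTS AGE).

```shell
# Hypothetical helper (not from the report): print the names of pods whose
# STATUS column is CrashLoopBackOff in default `oc get pod` output.
# Assumes the standard columns: NAME READY STATUS RESTARTS AGE.
crashing_pods() {
  awk 'NR > 1 && $3 == "CrashLoopBackOff" { print $1 }'
}
```

For the output above, `oc --kubeconfig=hosted get pod -n openshift-ovn-kubernetes | crashing_pods` would print `ovn-ipsec-containerized-xsk5d`.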

      Actual results:

          The ipsec pod never becomes Ready and stays in CrashLoopBackOff.

      Expected results:

          The ipsec pod is Ready without any crashes.

      Additional info:

       1. describe the crashing ipsec pod:

      $ oc --kubeconfig=hosted describe pod ovn-ipsec-containerized-xsk5d  -n openshift-ovn-kubernetes
      ...
        Normal   Started    67m (x3 over 73m)      kubelet            Started container ovn-ipsec
        Normal   Killing    65m (x3 over 71m)      kubelet            Container ovn-ipsec failed liveness probe, will be restarted
        Normal   Created    64m (x4 over 73m)      kubelet            Created container ovn-ipsec
        Warning  Unhealthy  18m (x41 over 73m)     kubelet            Liveness probe failed: no ipsec traffic configured
        Normal   Pulled     8m51s (x16 over 73m)   kubelet            Container image "quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2552c194feb83dcf2eac3253196caed184786669a49949270a2d4d0ed906510c" already present on machine
        Warning  BackOff    3m49s (x109 over 49m)  kubelet            Back-off restarting failed container ovn-ipsec in pod ovn-ipsec-containerized-xsk5d_openshift-ovn-kubernetes(d5a97c94-331a-43a3-9c35-801cfe9a3fa1)
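The probe message "no ipsec traffic configured" suggests the check fails because libreswan reports no established tunnels. The sketch below is an assumption about the kind of check involved, not the actual probe script: libreswan's `ipsec trafficstatus` prints one `#<n>: ...` line per established SA, so an empty listing means no traffic; the `has_ipsec_traffic` name and the line format are guesses.

```shell
# Hypothetical sketch of a "is any IPsec traffic configured?" check.
# Assumption: `ipsec trafficstatus` emits a "#<n>: ..." line per established
# security association, so matching that pattern detects active tunnels.
has_ipsec_traffic() {
  grep -qE '#[0-9]+:'
}
```

Run inside the ovn-ipsec container, something like `ipsec trafficstatus | has_ipsec_traffic || echo 'no ipsec traffic configured'` would reproduce the probe's complaint.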

      2. logs:

      $ oc --kubeconfig=hosted logs ovn-ipsec-containerized-xsk5d  -n openshift-ovn-kubernetes
      Defaulted container "ovn-ipsec" out of: ovn-ipsec, ovn-keys (init)
      + trap cleanup SIGTERM
      + counter=0
      + '[' -f /etc/cni/net.d/10-ovn-kubernetes.conf ']'
      + echo 'ovnkube-node has configured node.'
      ovnkube-node has configured node.
      + ip x s flush
      + ip x p flush
      + ulimit -n 1024
      + /usr/libexec/ipsec/addconn --config /etc/ipsec.conf --checkconfig
      + /usr/libexec/ipsec/_stackmanager start
      + /usr/sbin/ipsec --checknss
      Initializing NSS database
       
      + /usr/libexec/ipsec/pluto --leak-detective --config /etc/ipsec.conf --logfile /var/log/openvswitch/libreswan.log
      + /usr/libexec/platform-python /usr/share/openvswitch/scripts/ovs-monitor-ipsec --pidfile=/var/run/openvswitch/ovs-monitor-ipsec.pid --ike-daemon=libreswan --no-restart-ike-daemon --ipsec-d /var/lib/ipsec/nss --log-file --monitor unix:/var/run/openvswitch/db.sock
      2024-10-23T13:46:58Z |  7  | reconnect | INFO | unix:/var/run/openvswitch/db.sock: connecting...
      2024-10-23T13:46:58Z |  10 | reconnect | INFO | unix:/var/run/openvswitch/db.sock: connected
      2024-10-23T13:46:58Z |  15 | ovs-monitor-ipsec | INFO | Refreshing LibreSwan configuration
      002 loading secrets from "/etc/ipsec.secrets"
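The log stops right after pluto loads its secrets, which suggests ovs-monitor-ipsec never gets as far as writing tunnel configuration. A quick triage step is to filter the pluto log (the path comes from the `--logfile` flag above) for failure lines; the `pluto_errors` helper and its keyword list are assumptions, not taken from the report.

```shell
# Hypothetical triage helper: surface likely failure lines from the pluto log
# written to /var/log/openvswitch/libreswan.log by the script above.
# The keyword list is a guess at common libreswan error wording.
pluto_errors() {
  grep -iE 'error|fail|cannot|invalid'
}
```

Usage would be `pluto_errors < /var/log/openvswitch/libreswan.log`; if available, the OVS IPsec tutorial also documents `ovs-appctl -t ovs-monitor-ipsec tunnels/show` for inspecting which tunnels the monitor has actually configured.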
      

       

              Assignee: Cesar Wong (cewong@redhat.com)
              Reporter: He Liu (rhn-support-heli)
              Votes: 0
              Watchers: 3