OpenShift API for Data Protection
OADP-1932

Restic restore failed, exiting on PV claim: "Failed to get Volumesnapshot", "error getting persistent volume claim for volume"


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Affects Version: OADP 1.2.0
    • Fix Version: OADP 1.2.0

      Description of problem:

      Running a restore using Restic on a single namespace with 1000 pods (PVC size 32MB) on ceph-rbd.

      The restore failed with the following two different errors:

       time="2023-05-11T20:40:52Z" level=error msg="unable to successfully complete pod volume restores of pod's volumes" error="error getting persistent volume claim for volume: persistentvolumeclaims \"pvc-busybox-perf-single-ns-1000-pods-105\" not found" logSource="/remote-source/velero/app/pkg/restore/restore.go:1699" restore=openshift-adp/restore-restic-busybox-perf-single-1000-pods-rbd-rbd
       time="2023-05-11T20:40:55Z" level=error msg="unable to successfully complete pod volume restores of pod's volumes" error="error getting persistent volume claim for volume: persistentvolumeclaims \"pvc-busybox-perf-single-ns-1000-pods-106\" not found" logSource="/remote-source/velero/app/pkg/restore/restore.go:1699" restore=openshift-adp/restore-restic-busybox-perf-single-1000-pods-rbd-rbd
       time="2023-05-11T20:40:58Z" level=error msg="unable to successfully complete pod volume restores of pod's volumes" error="error getting persistent volume claim for volume: persistentvolumeclaims \"pvc-busybox-perf-single-ns-1000-pods-107\" not found" logSource="/remote-source/velero/app/pkg/restore/restore.go:1699" restore=openshift-adp/restore-restic-busybox-perf-single-1000-pods-rbd-rbd

       

      time="2023-05-11T21:34:02Z" level=error msg="Namespace busybox-perf-single-ns-1000-pods, resource restore error: error preparing persistentvolumeclaims/busybox-perf-single-ns-1000-pods/pvc-busybox-perf-single-ns-1000-pods-997: rpc error: code = Unknown desc = Failed to get Volumesnapshot busybox-perf-single-ns-1000-pods/velero-pvc-busybox-perf-single-ns-1000-pods-997-8hxmj to restore PVC busybox-perf-single-ns-1000-pods/pvc-busybox-perf-single-ns-1000-pods-997: volumesnapshots.snapshot.storage.k8s.io \"velero-pvc-busybox-perf-single-ns-1000-pods-997-8hxmj\" not found" logSource="/remote-source/velero/app/pkg/controller/restore_controller.go:498" restore=openshift-adp/restore-restic-busybox-perf-single-1000-pods-rbd-rbd
      time="2023-05-11T21:34:02Z" level=error msg="Namespace busybox-perf-single-ns-1000-pods, resource restore error: error preparing persistentvolumeclaims/busybox-perf-single-ns-1000-pods/pvc-busybox-perf-single-ns-1000-pods-998: rpc error: code = Unknown desc = Failed to get Volumesnapshot busybox-perf-single-ns-1000-pods/velero-pvc-busybox-perf-single-ns-1000-pods-998-vrns8 to restore PVC busybox-perf-single-ns-1000-pods/pvc-busybox-perf-single-ns-1000-pods-998: volumesnapshots.snapshot.storage.k8s.io \"velero-pvc-busybox-perf-single-ns-1000-pods-998-vrns8\" not found" logSource="/remote-source/velero/app/pkg/controller/restore_controller.go:498" restore=openshift-adp/restore-restic-busybox-perf-single-1000-pods-rbd-rbd
      time="2023-05-11T21:34:02Z" level=error msg="Namespace busybox-perf-single-ns-1000-pods, resource restore error: error preparing persistentvolumeclaims/busybox-perf-single-ns-1000-pods/pvc-busybox-perf-single-ns-1000-pods-999: rpc error: code = Unknown desc = Failed to get Volumesnapshot busybox-perf-single-ns-1000-pods/velero-pvc-busybox-perf-single-ns-1000-pods-999-stpqw to restore PVC busybox-perf-single-ns-1000-pods/pvc-busybox-perf-single-ns-1000-pods-999: volumesnapshots.snapshot.storage.k8s.io \"velero-pvc-busybox-perf-single-ns-1000-pods-999-stpqw\" not found" logSource="/remote-source/velero/app/pkg/controller/restore_controller.go:498" restore=openshift-adp/restore-restic-busybox-perf-single-1000-pods-rbd-rbd

      Version-Release number of selected component (if applicable):

      OCP: 4.11.7

      ODF: 4.11.7
      OADP: 1.2.0-69

       

      Steps to Reproduce:
      1. Create a backup: backup-restic-busybox-perf-single-1000-pods-rbd
      2. After the backup completes, delete the namespace.
      3. Run a restore, restore-restic-busybox-perf-single-1000-pods-rbd-rbd, from the same backup CR: backup-restic-busybox-perf-single-1000-pods-rbd
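      The backup/restore pair from the steps above can be sketched as Velero CRs. This is a hypothetical reconstruction: only the CR names, the target namespace, and the use of Restic (pod-volume/file-system backup) come from this report; every other field value is an illustrative assumption.

      ```yaml
      # Sketch of the backup CR from step 1 (names taken from this report;
      # other fields are assumptions).
      apiVersion: velero.io/v1
      kind: Backup
      metadata:
        name: backup-restic-busybox-perf-single-1000-pods-rbd
        namespace: openshift-adp
      spec:
        includedNamespaces:
          - busybox-perf-single-ns-1000-pods
        # Route PV data through the file-system (Restic/Kopia) path rather
        # than CSI snapshots; assumed from "restore using Restic" above.
        defaultVolumesToFsBackup: true
      ---
      # Restore CR from step 3, referencing the same backup.
      apiVersion: velero.io/v1
      kind: Restore
      metadata:
        name: restore-restic-busybox-perf-single-1000-pods-rbd-rbd
        namespace: openshift-adp
      spec:
        backupName: backup-restic-busybox-perf-single-1000-pods-rbd
      ```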

      Actual results:

       "status": {
              "completionTimestamp": "2023-05-11T21:34:02Z",
              "errors": 2000,
              "phase": "PartiallyFailed",
              "progress": {
                  "itemsRestored": 9025,
                  "totalItems": 9025
              },
              "startTimestamp": "2023-05-11T20:28:24Z",
              "warnings": 11
       }

      Expected results:

      The restore completes with phase "Completed" and no errors.

      Additional info:
      This bug may be related to:
      https://issues.redhat.com/browse/OADP-1844

        1. velero-5848b67986-rgbfx.log
          22.49 MB
        2. openshift-adp-controller-manager-7c55457f9-2b6tv.log
          8 kB
        3. oadp-velero.log
          117.01 MB
        4. oadp-report.json
          25 kB
        5. oadp-cr.json
          1 kB
        6. node-agent-xbcfr.log
          1.19 MB
        7. node-agent-vbptq.log
          1.07 MB
        8. node-agent-v9z4g.log
          1.27 MB
        9. node-agent-ncxcw.log
          1.11 MB
        10. node-agent-n87df.log
          1.22 MB
        11. node-agent-n5w9v.log
          1.21 MB
        12. node-agent-mdkcp.log
          1.22 MB
        13. node-agent-jmqkr.log
          1.14 MB
        14. node-agent-jbqjx.log
          1.23 MB
        15. node-agent-hjfvk.log
          1.19 MB
        16. node-agent-4qhpj.log
          1.30 MB
        17. node-agent-2wn5p.log
          1.19 MB
        18. benchmark_runner.log
          63 kB

              Assignee: Wes Hayutin (wnstb)
              Reporter: Tzahi Ashkenazi (tzahia)
              Votes: 0
              Watchers: 6