OpenShift API for Data Protection / OADP-3496

PodVolumeRestore is not created for empty Volume



      Description of problem:

       
      PodVolumeRestores are not created for PodVolumeBackups of empty volumes when using Restic.
      Backup and restore both complete successfully. Four PodVolumeBackups are created, two of which have the following status:
       

      status:     completionTimestamp: "2024-02-07T11:12:21Z"     message: volume was empty so no snapshot was taken     path: /host_pods/e7444087-ff8c-4a1b-aba2-d4fe214ac189/volumes/kubernetes.io~csi/pvc-1a1cb1f3-7866-477f-91b4-cf185177bd97/mount     phase: Completed     progress: {}     startTimestamp: "2024-02-07T11:12:20Z"

       
      The other two PodVolumeBackups have the following status:
       
       

       status:     completionTimestamp: "2024-02-07T11:11:23Z"     path: /host_pods/e7444087-ff8c-4a1b-aba2-d4fe214ac189/volumes/kubernetes.io~csi/pvc-b1196dbd-a743-4c59-ad7c-b623fa24f567/mount     phase: Completed     progress:       bytesDone: 107855491       totalBytes: 107855491     snapshotID: d4239b7b     startTimestamp: "2024-02-07T11:11:18Z"  

       
      Only two PodVolumeRestores are created.
      The storage class used is ocs-storagecluster-cephfs.
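
      To see which of the four PodVolumeBackups produced a snapshot and which
      were skipped as empty, something like the following can be run. The
      backup name and the velero.io/backup-name label are taken from the
      resources shown in this report; adjust them for your own run.

      oc get podvolumebackups.velero.io -n openshift-adp \
        -l velero.io/backup-name=mysql-75346115-c5a9-11ee-9f68-fa163eb837ae \
        -o custom-columns='NAME:.metadata.name,PHASE:.status.phase,SNAPSHOT:.status.snapshotID,MESSAGE:.status.message'

      The two PodVolumeBackups skipped as empty show no snapshot ID, which
      lines up with the two missing PodVolumeRestores.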
       
       
      Version-Release number of selected component (if applicable):
      1.3.1
       
       
      How reproducible:
       
       
       
      Steps to Reproduce:
      1. Follow the Polarion test steps: https://polarion.engineering.redhat.com/polarion/redirect/project/OADP/workitem?id=OADP-266
      2. Check whether the number of PodVolumeBackups equals the number of PodVolumeRestores (see the commands below).
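
      A quick way to perform the check in step 2 is to compare label-selected
      counts. The velero.io/backup-name label is shown on the PodVolumeBackup
      in this report; the corresponding velero.io/restore-name label on
      PodVolumeRestore is assumed here, so adjust the selector if your
      PodVolumeRestores are labeled differently.

      BACKUP=mysql-75346115-c5a9-11ee-9f68-fa163eb837ae   # backup name from this report
      RESTORE=<restore-name>                              # substitute the name of your restore
      oc get podvolumebackups.velero.io -n openshift-adp \
        -l velero.io/backup-name="$BACKUP" --no-headers | wc -l
      oc get podvolumerestores.velero.io -n openshift-adp \
        -l velero.io/restore-name="$RESTORE" --no-headers | wc -l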
       
       
      Actual results:
      The number of PodVolumeBackups does not match the number of PodVolumeRestores.
       
       
      Expected results:
      The number of PodVolumeBackups should match the number of PodVolumeRestores.
       
       
       
      Additional info:

      apiVersion: v1
      items:
      - apiVersion: velero.io/v1
        kind: PodVolumeBackup
        metadata:
          annotations:
            velero.io/pvc-name: test-oadp-266-1
          creationTimestamp: "2024-02-07T11:11:18Z"
          generateName: mysql-75346115-c5a9-11ee-9f68-fa163eb837ae-
          generation: 3
          labels:
            velero.io/backup-name: mysql-75346115-c5a9-11ee-9f68-fa163eb837ae
            velero.io/backup-uid: 767199ca-d303-4a62-8d37-9c02b7dfcc77
            velero.io/pvc-uid: 1a1cb1f3-7866-477f-91b4-cf185177bd97
          name: mysql-75346115-c5a9-11ee-9f68-fa163eb837ae-88zlg
          namespace: openshift-adp
          ownerReferences:
          - apiVersion: velero.io/v1
            controller: true
            kind: Backup
            name: mysql-75346115-c5a9-11ee-9f68-fa163eb837ae
            uid: 767199ca-d303-4a62-8d37-9c02b7dfcc77
          resourceVersion: "4977145"
          uid: 6aa3e54b-9b3c-4a3d-b063-a73685a5e624
        spec:
          backupStorageLocation: ts-dpa-1
          node: worker-1
          pod:
            kind: Pod
            name: test-oadp-266-9ddfb9ddf-f579s
            namespace: test-oadp-266-1
            uid: e7444087-ff8c-4a1b-aba2-d4fe214ac189
          repoIdentifier: s3:s3-us-east-1.amazonaws.com/newocpbucket/velero-e2e-752f1e35-c5a9-11ee-9f68-fa163eb837ae/restic/test-oadp-266-1
          tags:
            backup: mysql-75346115-c5a9-11ee-9f68-fa163eb837ae
            backup-uid: 767199ca-d303-4a62-8d37-9c02b7dfcc77
            ns: test-oadp-266-1
            pod: test-oadp-266-9ddfb9ddf-f579s
            pod-uid: e7444087-ff8c-4a1b-aba2-d4fe214ac189
            pvc-uid: 1a1cb1f3-7866-477f-91b4-cf185177bd97
            volume: test-oadp-266-data1
          uploaderType: restic
          volume: test-oadp-266-data1
        status:
          completionTimestamp: "2024-02-07T11:12:21Z"
          message: volume was empty so no snapshot was taken
          path: /host_pods/e7444087-ff8c-4a1b-aba2-d4fe214ac189/volumes/kubernetes.io~csi/pvc-1a1cb1f3-7866-477f-91b4-cf185177bd97/mount
          phase: Completed
          progress: {}
          startTimestamp: "2024-02-07T11:12:20Z"

Assignee: Tiger Kaovilai (tkaovila@redhat.com)
Reporter: Aniruddha Nayek (anayek)