Type: Sub-task
Resolution: Done
Description of problem:
The Restic restore gets stuck and the application pod goes into an error state. The PodVolumeRestore also never reports a status.
$ oc get pods -n ocp-attached-pvc
NAME                          READY   STATUS                      RESTARTS   AGE
attached-pvc-6d6f8648-cz5pr   0/1     Init:CreateContainerError   0          29s

$ oc get podvolumerestore
NAME         NAMESPACE          POD                           UPLOADER TYPE   VOLUME       STATUS   TOTALBYTES   BYTESDONE   AGE
test-h79xg   ocp-attached-pvc   attached-pvc-6d6f8648-cz5pr   restic          testvolume                                     21m
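For triage, the stuck init container and the empty PodVolumeRestore status can be inspected directly; the pod and resource names below are taken from the output above:

$ oc describe pod attached-pvc-6d6f8648-cz5pr -n ocp-attached-pvc
$ oc get podvolumerestore test-h79xg -o yaml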
Version-Release number of selected component (if applicable):
OADP 1.2
How reproducible:
Always; fails consistently.
Steps to Reproduce:
1. Deploy an application with an attached PVC
2. Create a backup using restic and wait until it completes successfully
3. Create a restore from that backup (sample CRs sketched below)
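A minimal Backup/Restore CR pair matching these steps, as a sketch: the CR names and the openshift-adp namespace are illustrative assumptions, not taken from this report, and depending on the Velero version bundled with OADP the opt-in field is defaultVolumesToRestic or its newer replacement defaultVolumesToFsBackup.

apiVersion: velero.io/v1
kind: Backup
metadata:
  name: test
  namespace: openshift-adp   # assumption: default OADP install namespace
spec:
  includedNamespaces:
  - ocp-attached-pvc
  defaultVolumesToRestic: true   # assumption: restic opt-in via the older field name
---
apiVersion: velero.io/v1
kind: Restore
metadata:
  name: test
  namespace: openshift-adp
spec:
  backupName: test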
Actual results:
Restore got stuck
$ oc get restore -o yaml
status:
  phase: InProgress
  progress:
    itemsRestored: 25
    totalItems: 25
  startTimestamp: "2022-12-12T12:56:07Z"
$ oc get pods -n ocp-attached-pvc
NAME                          READY   STATUS                      RESTARTS   AGE
attached-pvc-6d6f8648-cz5pr   0/1     Init:CreateContainerError   0          29s

$ oc get pods -o yaml
  containerStatuses:
  - image: quay.io/openshifttest/alpine
    imageID: ""
    lastState: {}
    name: podtest
    ready: false
    restartCount: 0
    started: false
    state:
      waiting:
        reason: PodInitializing
  hostIP: 10.0.128.2
  initContainerStatuses:
  - image: registry.redhat.io/oadp/oadp-velero-restic-restore-helper-rhel8@sha256:5458a3f839695e3e9486c9b1e2f023d3a702f1d0c7662aebe74fd9dbb8484342
    imageID: ""
    lastState: {}
    name: restore-wait
    ready: false
    restartCount: 0
    state:
      waiting:
        message: |
          container create failed: time="2022-12-12T12:58:44Z" level=error msg="runc create failed: unable to start container process: exec: \"/velero-restore-helper\": stat /velero-restore-helper: no such file or directory"
        reason: CreateContainerError
  phase: Pending
  podIP: 10.131.0.68
  podIPs:
  - ip: 10.131.0.68
  qosClass: Burstable
  startTime: "2022-12-12T12:56:10Z"
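The runc error indicates the restore-wait init container is configured to exec /velero-restore-helper, but no binary exists at that path in the restore-helper image. One way to confirm what the init container actually invokes, sketched here (pod name from the output above; the jsonpath assumes restore-wait is the first init container):

$ oc get pod attached-pvc-6d6f8648-cz5pr -n ocp-attached-pvc \
    -o jsonpath='{.spec.initContainers[0].command}'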
Expected results:
The restore should complete successfully and the application pod should reach Running.
Additional info: