Type: Bug
Resolution: Duplicate
Version: 2.6.3
I have 2 SNO environments configured identically; both have the same set of operators:
- LVM Storage Operator providing the storage
- OpenShift Virtualization
- OpenShift Migration Toolkit for Virtualization
Both have the same NNCP and NAD configurations, and both have the same Project names (see the sketch below).
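For reference, the configuration in play has roughly the following shape. This is a minimal sketch only: the policy/NAD names, the uplink NIC eno1, the namespace, and VLAN ID 100 are placeholder assumptions, not the actual values from my environments.

# NodeNetworkConfigurationPolicy creating a Linux bridge on the node
apiVersion: nmstate.io/v1
kind: NodeNetworkConfigurationPolicy
metadata:
  name: br-vlan-policy          # placeholder name
spec:
  desiredState:
    interfaces:
      - name: br-vlan
        type: linux-bridge
        state: up
        bridge:
          port:
            - name: eno1        # placeholder uplink NIC
---
# NetworkAttachmentDefinition exposing the bridge with a VLAN tag
apiVersion: k8s.cni.cncf.io/v1
kind: NetworkAttachmentDefinition
metadata:
  name: vlan-100-net            # placeholder name
  namespace: my-project         # same Project name exists on both SNOs
spec:
  config: |
    {
      "cniVersion": "0.3.1",
      "name": "vlan-100-net",
      "type": "cnv-bridge",
      "bridge": "br-vlan",
      "vlan": 100
    }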
I have 2 VMs running on SNO1, created from the RHEL9 Template we supply, with only the following changes (see the spec fragment below):
- Connected the NIC to a VLAN-configured NAD
- Deleted the Pod Network NIC
- Configured the cloud-user password and network config in scripts
- Injected an SSH key
So they only have the 30Gi storage disk that is created as part of the template. I am trying to migrate these from SNO1 to SNO2.
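The network-related edits amount to something like this fragment of the VirtualMachine spec (a sketch; the interface name, NAD name, and cloud-init contents are placeholders):

spec:
  template:
    spec:
      domain:
        devices:
          interfaces:
            - name: vlan-net     # bridge interface replacing the deleted pod-network NIC
              bridge: {}
      networks:
        - name: vlan-net
          multus:
            networkName: vlan-100-net   # the VLAN-configured NAD
      volumes:
        - name: cloudinitdisk
          cloudInitNoCloud:
            userData: |
              #cloud-config
              user: cloud-user
              password: <redacted>      # placeholder; set via the template scripts
              chpasswd: { expire: false }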
I create the Migration Plan on SNO2, and when I execute it, the plan stays in a Running state indefinitely. When I check the destination namespace on SNO2 and examine the importer-prime pods, they eventually error out with the following message:
I0724 21:45:18.008179 1 data-processor.go:259] New phase: TransferScratch
I0724 21:45:18.008282 1 util.go:194] Writing data...
E0724 21:48:30.325314 1 util.go:196] Unable to write file from dataReader: write /scratch/tmpimage: no space left on device
E0724 21:48:30.404770 1 data-processor.go:255] write /scratch/tmpimage: no space left on device
unable to write to file
kubevirt.io/containerized-data-importer/pkg/util.StreamDataToFile
	/remote-source/app/pkg/util/util.go:198
kubevirt.io/containerized-data-importer/pkg/importer.(*HTTPDataSource).Transfer
	/remote-source/app/pkg/importer/http-datasource.go:157
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).initDefaultPhases.func2
	/remote-source/app/pkg/importer/data-processor.go:185
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).ProcessDataWithPause
	/remote-source/app/pkg/importer/data-processor.go:252
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).ProcessData
	/remote-source/app/pkg/importer/data-processor.go:161
main.handleImport
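Note that the failure happens in the TransferScratch phase, i.e. while CDI is staging the image on a scratch PVC it allocates next to the target volume; it is that scratch volume (mounted at /scratch) that fills up, not the 30Gi target disk itself. In case it helps the discussion: the storage class backing scratch space can be pinned on the CDI CR. This is only a sketch, and "lvms-vg1" is a placeholder for whatever storage class the LVM Storage Operator provides here.

apiVersion: cdi.kubevirt.io/v1beta1
kind: CDI
metadata:
  name: cdi
spec:
  config:
    scratchSpaceStorageClass: lvms-vg1   # placeholder storage class name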
I spoke with Arik and Alexander on Slack (#forum-mig-mtv) and am opening this issue as part of that discussion.