Bug
Resolution: Done
Severity: Moderate
Failing to deploy Ironic based on the configuration from the beta docs.
2024-07-10 17:57:23.432 1 ERROR nova.virt.ironic.driver [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] An unknown error has occurred when trying to get the list of nodes from the Ironic inventory. Error: ['internal', 'public'] endpoint for baremetal service not found: keystoneauth1.exceptions.catalog.EndpointNotFound: ['internal', 'public'] endpoint for baremetal service not found
2024-07-10 17:57:23.432 1 WARNING nova.compute.manager [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] Virt driver is not ready.: nova.exception.VirtDriverNotReady: Virt driver is not ready.
2024-07-10 17:57:26.433 1 DEBUG oslo_service.periodic_task [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2024-07-10 17:57:27.136 1 DEBUG oslo_service.periodic_task [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2024-07-10 17:57:29.135 1 DEBUG oslo_service.periodic_task [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2024-07-10 17:57:29.146 1 DEBUG oslo_service.periodic_task [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2024-07-10 17:57:29.146 1 DEBUG nova.compute.manager [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10442
2024-07-10 17:57:30.139 1 DEBUG oslo_service.periodic_task [None req-4a5a43dd-5b18-4fd7-a279-7315635e28e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
My config:
ironic:
  enabled: true
  template:
    rpcTransport: oslo
    databaseInstance: openstack
    ironicAPI:
      replicas: 1
      override:
        service:
          internal:
            metadata:
              annotations:
                metallb.universe.tf/address-pool: internalapi
                metallb.universe.tf/allow-shared-ip: internalapi
                metallb.universe.tf/loadBalancerIPs: 172.41.11.80
            spec:
              type: LoadBalancer
    ironicConductors:
    - replicas: 1
      storageRequest: 10G
      networkAttachments:
      - baremetal
      provisionNetwork: baremetal
      customServiceConfig: |
        [neutron]
        cleaning_network = provisioning
        provisioning_network = provisioning
        rescuing_network = provisioning
    ironicInspector:
      replicas: 0
      networkAttachments:
      - baremetal
      inspectionNetwork: baremetal
    ironicNeutronAgent:
      replicas: 1
    secret: osp-secret
I also added this to the nova section:
nova:
  apiOverride:
    route: {}
  template:
    secret: osp-secret
    cellTemplates:
      cell0:
        cellDatabaseUser: nova_cell0
        hasAPIAccess: true
      cell1:
        cellDatabaseUser: nova_cell1
        cellDatabaseInstance: openstack-cell1
        cellMessageBusInstance: rabbitmq-cell1
        hasAPIAccess: true
    novaComputeTemplates:
      compute-ironic:
        computeDriver: ironic.IronicDriver
I took the templates from the beta docs:
https://docs.redhat.com/en/documentation/red_hat_openstack_services_on_openshift/18.0-beta/html/deploying_red_hat_openstack_services_on_openshift/assembly_creating-the-control-plane#proc_adding-the-bare-metal-provisioning-service-ironic-to-the-control-plane_controlplane
I also attempted the following config:
https://github.com/openstack-k8s-operators/architecture/pull/211/files#diff-4282284ac87d1c828ecc9ae6da0072ec85b20cdb2f52b8c8c83c69aaae82b253R304
But when applying that, I get the following error from the ironic operator:
2024-07-10T19:50:42Z INFO Controllers.Ironic Reconciling Service {"controller": "ironic", "controllerGroup": "ironic.openstack.org", "controllerKind": "Ironic", "Ironic":
, "namespace": "openstack", "name": "ironic", "reconcileID": "5e9ff747-adcb-4311-bc29-31b8386cdb7e"}
2024-07-10T19:50:42Z ERROR Reconciler error {"controller": "ironic", "controllerGroup": "ironic.openstack.org", "controllerKind": "Ironic", "Ironic":
, "namespace": "openstack", "name": "ironic", "reconcileID": "5e9ff747-adcb-4311-bc29-31b8386cdb7e", "error": "accountName is empty"}
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler
/remote-source/deps/gomod/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.5/pkg/internal/controller/controller.go:329
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem
/remote-source/deps/gomod/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.5/pkg/internal/controller/controller.go:266
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2
/remote-source/deps/gomod/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.5/pkg/internal/controller/controller.go:227
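For reference, the CR state can be inspected roughly like this (a sketch; the CR name "ironic" and the "openstack" namespace are taken from the operator log above, so adjust to your environment):

```shell
# Rough diagnostic sketch; resource names assumed from the logs above.
oc get ironic ironic -n openstack           # overall CR status
oc describe ironic ironic -n openstack      # conditions show what is blocking
oc get pods -n openstack | grep -i ironic   # API / conductor / inspector pods
```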
Some more info:
cjanisze@p1g4:~/Documents/projects/ROSO$ oc rsh -n openstack openstackclient
sh-5.1$ openstack endpoint list
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
| ID | Region | Service Name | Service Type | Enabled | Interface | URL |
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
| 0e54851badde4724be7eec77830a292a | regionOne | neutron | network | True | public | https://neutron-public-openstack.apps.osphackfest.sand4.auto.bos2.lab |
| 0e8bed4a5c5d4f10b4cc8cd7ec5f8622 | regionOne | placement | placement | True | public | https://placement-public-openstack.apps.osphackfest.sand4.auto.bos2.lab |
| 12b1d70086d44b6c958397ca797410de | regionOne | barbican | key-manager | True | public | https://barbican-public-openstack.apps.osphackfest.sand4.auto.bos2.lab |
| 19be6407294e4957b217267538f69dfa | regionOne | barbican | key-manager | True | internal | https://barbican-internal.openstack.svc:9311 |
| 21e0e500731b46b697e7166ebceb4277 | regionOne | glance | image | True | public | https://glance-default-public-openstack.apps.osphackfest.sand4.auto.bos2.lab |
| 29a25c50e97f4cabb61a367cf69c535e | regionOne | neutron | network | True | internal | https://neutron-internal.openstack.svc:9696 |
| 3b6c872ed762494bba701fd2fb6db01d | regionOne | glance | image | True | internal | https://glance-default-internal.openstack.svc:9292 |
| 5726e0eef8ec4e68af97bacc6e185618 | regionOne | keystone | identity | True | internal | https://keystone-internal.openstack.svc:5000 |
| 6b7f79eb1a48452dbf0d7154c4f77725 | regionOne | cinderv3 | volumev3 | True | internal | https://cinder-internal.openstack.svc:8776/v3 |
| 714ddad9d8f34e8d9b9fecfab8ad26c0 | regionOne | nova | compute | True | public | https://nova-public-openstack.apps.osphackfest.sand4.auto.bos2.lab/v2.1 |
| 8556d05323ee417da7a6c187392febf5 | regionOne | cinderv3 | volumev3 | True | public | https://cinder-public-openstack.apps.osphackfest.sand4.auto.bos2.lab/v3 |
| 8ef7a0f9c7924ea4bf5ba083a70e4a36 | regionOne | keystone | identity | True | public | https://keystone-public-openstack.apps.osphackfest.sand4.auto.bos2.lab |
| acea587babc0418d98b50b4806d0f828 | regionOne | swift | object-store | True | internal | https://swift-internal.openstack.svc:8080/v1/AUTH_%(tenant_id)s |
| bd85910098624782aea971ec06b24f6b | regionOne | nova | compute | True | internal | https://nova-internal.openstack.svc:8774/v2.1 |
| f4cd102cb5154bd4b6111b773a9713a2 | regionOne | swift | object-store | True | public | https://swift-public-openstack.apps.osphackfest.sand4.auto.bos2.lab/v1/AUTH_%(tenant_id)s |
| f5d60f84f0844d1e8ea83d29b4a01516 | regionOne | placement | placement | True | internal | https://placement-internal.openstack.svc:8778 |
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
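Note the list above contains no baremetal entry at all, which lines up with the EndpointNotFound in the nova-compute log. A simplified sketch of the lookup nova's ironic driver performs (an illustration only, not keystoneauth itself; the catalog contents mirror the endpoint list above):

```python
# Simplified illustration of the catalog lookup behind the nova error
# above (assumption: this mirrors, but is not, keystoneauth's code).
# The catalog reflects the `openstack endpoint list` output, which has
# no "baremetal" service registered.
catalog = {
    "compute": {"internal": "https://nova-internal.openstack.svc:8774/v2.1"},
    "network": {"internal": "https://neutron-internal.openstack.svc:9696"},
    # ... other services from the listing, but no "baremetal"
}

def endpoint_for(service_type, interfaces=("internal", "public")):
    """Return the first endpoint matching an allowed interface."""
    endpoints = catalog.get(service_type, {})
    for interface in interfaces:
        if interface in endpoints:
            return endpoints[interface]
    # Same shape as the error in the nova-compute log above
    raise LookupError(
        f"{list(interfaces)} endpoint for {service_type} service not found"
    )

print(endpoint_for("compute"))  # resolves fine
try:
    endpoint_for("baremetal")
except LookupError as err:
    print(err)  # ['internal', 'public'] endpoint for baremetal service not found
```

So until the ironic services come up and register their keystone endpoints, nova-compute will keep reporting VirtDriverNotReady.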
Logs attached.