OpenShift Bugs · OCPBUGS-65860

Cluster Autoscaler scales down only 1 node at a time despite identifying multiple empty nodes


    • Sprint: AUTOSCALE - Sprint 281

      Description of problem:

      When the cluster autoscaler (CAS) identifies multiple empty nodes for scale-down in an ARO Classic cluster, only one machine is deleted per reconcile cycle (~5 minutes), making scale-down very slow. Even when CAS identifies 10 empty nodes simultaneously and attempts to delete them all, the ClusterAPI provider's sequential SetSize() implementation results in one-at-a-time deletion.
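      The cost of the sequential behavior can be modeled with a small sketch (illustrative numbers taken from the logs below; this is NOT the actual ClusterAPI provider code):

```go
package main

import "fmt"

func main() {
	// Scenario from the logs: the MachineSet sits at 68 replicas while
	// CAS flags 10 empty nodes for removal in the same loop iteration.
	const emptyNodes = 10
	replicas := 68

	// Batched scale-down, as --max-empty-bulk-delete=10 would allow:
	// a single resize request covering all empty nodes at once.
	batchedTarget := replicas - emptyNodes
	fmt.Println("batched target:", batchedTarget) // reached in one reconcile

	// Observed sequential behavior: each reconcile cycle (~5 minutes)
	// shrinks the MachineSet by exactly one replica.
	cycles := 0
	for r := replicas; r > batchedTarget; r-- {
		cycles++
	}
	fmt.Println("reconcile cycles needed:", cycles)
	fmt.Println("approx. minutes to drain 10 empty nodes:", cycles*5)
}
```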

      Version-Release number of selected component (if applicable):

      ARO: 4.17.27
      Cluster Autoscaler: 1.30.1

      How reproducible:

      Always 

      Steps to Reproduce:

      1. In an ARO Classic cluster, create a MachineSet autoscaled with Min: 1, Max: 80 (the max could be any number)
      2. Create workloads that force the machines to scale up to a large count
      3. Remove the workloads, triggering CAS to scale the MachineSet replicas back down to 1
      4. Check that multiple nodes carry the ToBeDeletedByClusterAutoscaler taint and multiple machines carry the machine.openshift.io/delete-machine annotation
      5. Observe that only one machine at a time is deleted by the machine-controller
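      Step 1 can be expressed as a MachineAutoscaler resource along these lines (the names here are placeholders; substitute your actual MachineSet):

```yaml
apiVersion: autoscaling.openshift.io/v1beta1
kind: MachineAutoscaler
metadata:
  name: example-machineset          # placeholder name
  namespace: openshift-machine-api
spec:
  minReplicas: 1
  maxReplicas: 80                   # could be any number
  scaleTargetRef:
    apiVersion: machine.openshift.io/v1beta1
    kind: MachineSet
    name: example-machineset        # placeholder MachineSet name
```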

      Actual results:

      Only one machine is deleted by the machine-controller per reconcile cycle (~5 minutes).

      Expected results:

      CAS should delete empty nodes in batches, governed by the --max-empty-bulk-delete setting (default: 10). ARO Classic is expected to use the default value, so up to 10 empty nodes should be removed per scale-down cycle.
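      To confirm which value the running autoscaler was started with, its deployment arguments can be inspected on the live cluster (the deployment name cluster-autoscaler-default assumes the default ClusterAutoscaler resource name; adjust if yours differs):

```shell
# Prints the container args; look for --max-empty-bulk-delete
oc -n openshift-machine-api get deployment cluster-autoscaler-default \
  -o jsonpath='{.spec.template.spec.containers[0].args}'
```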

      Additional info:

       # MachineSet controller logs indicate it deletes only one machine at a time:
      
      2025-11-19T14:13:23.800118316Z I1119 14:13:23.800047       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 67, deleting 1
      2025-11-19T14:56:42.134422697Z I1119 14:56:42.134344       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 66, deleting 1
      2025-11-19T15:01:36.126064772Z I1119 15:01:36.125924       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 65, deleting 1
      2025-11-19T15:07:35.619152725Z I1119 15:07:35.619085       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 64, deleting 1
      2025-11-19T15:12:34.288344097Z I1119 15:12:34.288279       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 63, deleting 1
      2025-11-19T15:17:26.677685749Z I1119 15:17:26.677618       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 62, deleting 1
      2025-11-19T15:22:17.308617581Z I1119 15:22:17.308553       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 61, deleting 1
      2025-11-19T15:27:08.221157785Z I1119 15:27:08.221084       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 60, deleting 1
      2025-11-19T15:33:12.911794410Z I1119 15:33:12.911715       1 controller.go:308] Too many replicas for machine.openshift.io/v1beta1, Kind=MachineSet openshift-machine-api/poc-dpaas-ibm-dedicated-eastus21, need 59, deleting 1

      Meanwhile, CAS identifies 10 empty nodes at once; we would expect them to be deleted as a batch:

      2025-11-19T15:30:47.774104017Z I1119 15:30:47.774029       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-69cq9
      2025-11-19T15:30:47.774104017Z I1119 15:30:47.774065       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-4knnt
      2025-11-19T15:30:47.774178823Z I1119 15:30:47.774103       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-57k7f
      2025-11-19T15:30:47.774199925Z I1119 15:30:47.774075       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-vm9fv
      2025-11-19T15:30:47.774217326Z I1119 15:30:47.774178       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-42ktt
      2025-11-19T15:30:47.774347136Z I1119 15:30:47.774306       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-dvfj4
      2025-11-19T15:30:47.774366837Z I1119 15:30:47.774039       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-6qcs4
      2025-11-19T15:30:47.774383939Z I1119 15:30:47.774354       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-42s2r
      2025-11-19T15:30:47.774400640Z I1119 15:30:47.774306       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-g659n
      2025-11-19T15:30:47.774456144Z I1119 15:30:47.774038       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-q5dgv
      2025-11-19T15:31:55.057438805Z I1119 15:31:55.057340       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-787lx"
      2025-11-19T15:31:55.057649122Z I1119 15:31:55.057603       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-q5dgv"
      2025-11-19T15:31:55.057928744Z I1119 15:31:55.057892       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-sfgtx"
      2025-11-19T15:31:55.058207865Z I1119 15:31:55.058172       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-vm9fv"
      2025-11-19T15:31:55.058434783Z I1119 15:31:55.058405       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-578rg"
      2025-11-19T15:31:55.058638899Z I1119 15:31:55.058600       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-42s2r"
      2025-11-19T15:31:55.058828414Z I1119 15:31:55.058800       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-42ktt"
      2025-11-19T15:31:55.059019629Z I1119 15:31:55.058992       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-bdmnb"
      2025-11-19T15:31:55.059211144Z I1119 15:31:55.059184       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-djrk7"
      2025-11-19T15:31:55.059400859Z I1119 15:31:55.059374       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-4knnt"
      2025-11-19T15:32:00.062591188Z I1119 15:32:00.062499       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-787lx
      2025-11-19T15:32:00.062591188Z I1119 15:32:00.062542       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-sfgtx
      2025-11-19T15:32:00.062591188Z I1119 15:32:00.062562       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-4knnt
      2025-11-19T15:32:00.062686596Z I1119 15:32:00.062592       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-bdmnb
      2025-11-19T15:32:00.062686596Z I1119 15:32:00.062625       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-42s2r
      2025-11-19T15:32:00.062712698Z I1119 15:32:00.062564       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-q5dgv
      2025-11-19T15:32:00.062739200Z I1119 15:32:00.062701       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-djrk7
      2025-11-19T15:32:00.062779003Z I1119 15:32:00.062727       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-vm9fv
      2025-11-19T15:32:00.062797604Z I1119 15:32:00.062776       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-42ktt
      2025-11-19T15:32:00.062814506Z I1119 15:32:00.062747       1 drain.go:131] All pods removed from poc-dpaas-ibm-dedicated-eastus21-578rg
      2025-11-19T15:33:07.422597208Z I1119 15:33:07.422439       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-vq2mb"
      2025-11-19T15:33:07.422837526Z I1119 15:33:07.422808       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-bd8gx"
      2025-11-19T15:33:07.423131148Z I1119 15:33:07.423108       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-9csgz"
      2025-11-19T15:33:07.423383467Z I1119 15:33:07.423360       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-pxg8p"
      2025-11-19T15:33:07.423603484Z I1119 15:33:07.423583       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-lwxkq"
      2025-11-19T15:33:07.423813299Z I1119 15:33:07.423792       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-787lx"
      2025-11-19T15:33:07.424013114Z I1119 15:33:07.423992       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-69cq9"
      2025-11-19T15:33:07.424207329Z I1119 15:33:07.424187       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-jnkft"
      2025-11-19T15:33:07.424394643Z I1119 15:33:07.424374       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-6wk6v"
      2025-11-19T15:33:07.424581357Z I1119 15:33:07.424561       1 actuator.go:147] Scale-down: removing empty node "poc-dpaas-ibm-dedicated-eastus21-wqsft"   

       

              rh-ee-prozehna Paul Rozehnal
              rhn-support-judzhu Jude Zhu