Fast Datapath Product
FDP-1857

QE verification: Stale samples from an allow ACL are generated after the ACL is removed and the packets are dropped

    • Task
    • Resolution: Unresolved
    • Normal
    • ovn24.09

      ( ) The bug has been reproduced and verified by QE members
      ( ) Test coverage has been added to downstream CI
      ( ) For new feature, failed test plans have bugs added as children to the epic
      ( ) The bug is cloned to any relevant release that we support and/or is needed

    • rhel-9

This ticket is tracking the QE verification effort for the solution to the problem described below.
Apply two ACLs: a drop ACL and an allow ACL with higher priority. Start a long-lived connection. Remove the allow ACL, then watch stale allow samples still being generated even though the packets are dropped.

       

Here is the db config:

       

      #acls
      
      _uuid               : 6d3c12fd-6e87-4438-b8b8-fc2e149ed2d4
      action              : drop
      direction           : from-lport
      external_ids        : {direction=Egress, "k8s.ovn.org/id"="default-network-controller:NetpolNamespace:iperf:Egress:defaultDeny", "k8s.ovn.org/name"=iperf, "k8s.ovn.org/owner-controller"=default-network-controller, "k8s.ovn.org/owner-type"=NetpolNamespace, type=defaultDeny}
      label               : 0
      log                 : false
      match               : "inport == @a9685538327780762322"
      meter               : acl-logging
      name                : "NP:iperf:Egress"
      options             : {apply-after-lb="true"}
      priority            : 1000
      sample_est          : d9193b7b-85de-43b3-a4ab-d9a7e1d569ce
      sample_new          : d9193b7b-85de-43b3-a4ab-d9a7e1d569ce
      severity            : []
      tier                : 2
      
      _uuid               : eb4bef4c-ebc9-4441-a075-d0bab4990c21
      action              : allow-related
      direction           : from-lport
      external_ids        : {direction=Egress, gress-index="0", ip-block-index="-1", "k8s.ovn.org/id"="default-network-controller:NetworkPolicy:iperf:iperf3-server-access-egress:Egress:0:tcp:-1", "k8s.ovn.org/name"="iperf:iperf3-server-access-egress", "k8s.ovn.org/owner-controller"=default-network-controller, "k8s.ovn.org/owner-type"=NetworkPolicy, port-policy-protocol=tcp}
      label               : 0
      log                 : false
      match               : "ip4 && tcp && tcp.dst==8080 && inport == @a13584117918772089294"
      meter               : acl-logging
      name                : "NP:iperf:iperf3-server-access-egress:Egress:0"
      options             : {apply-after-lb="true"}
      priority            : 1001
      sample_est          : e5801939-cb50-4678-a336-ce4a8508e1cd
      sample_new          : e5801939-cb50-4678-a336-ce4a8508e1cd
      severity            : []
      tier                : 2
      
      #samples
      
      _uuid               : d9193b7b-85de-43b3-a4ab-d9a7e1d569ce
      collectors          : [854e8614-759d-4739-bde9-5bcb08833406]
      metadata            : 4007437751
      
      _uuid               : e5801939-cb50-4678-a336-ce4a8508e1cd
      collectors          : [854e8614-759d-4739-bde9-5bcb08833406]
      metadata            : 3508384340
      
      # ovn-nbctl find sample_collector
      _uuid               : 854e8614-759d-4739-bde9-5bcb08833406
      external_ids        : {sample-features="AdminNetworkPolicy,EgressFirewall,Multicast,NetworkPolicy,UDNIsolation"}
      id                  : 2
      name                : ""
      probability         : 65535
      set_id              : 42
      
# ovs-vsctl list Flow_Sample_Collector_Set
      _uuid               : fe157dee-5786-43d6-918c-caebaec83039
      bridge              : 2591c1e9-973f-43da-9f23-c4c610902651
      external_ids        : {owner=netobservAgent}
      id                  : 42
      ipfix               : []
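
The dump above ties everything together: each ACL references a Sample row, whose metadata value is what shows up as obs_point in the collector logs. A minimal sketch of that correlation, with the UUIDs and metadata values copied from this ticket's output (the lookup-by-metadata behavior is an assumption based on the "find sample failed" errors below):

```python
# Sample rows from the NB DB dump above, keyed by _uuid.
# metadata is the value that appears as obs_point in the collector logs.
samples = {
    "d9193b7b-85de-43b3-a4ab-d9a7e1d569ce": 4007437751,  # referenced by the drop ACL
    "e5801939-cb50-4678-a336-ce4a8508e1cd": 3508384340,  # referenced by the (deleted) allow ACL
}

def sample_for_obs_point(obs_point):
    """Return the Sample _uuid whose metadata matches obs_point, if any."""
    for uuid, metadata in samples.items():
        if metadata == obs_point:
            return uuid
    return None

# The stale logs below carry obs_point=3508384340, i.e. the allow ACL's
# sample; once that row is deleted, the decoder can no longer resolve it,
# which would explain the "find sample failed: object not found" errors.
print(sample_for_obs_point(3508384340))
```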

Then I see many samples for the same allow ACL within the same second (08:29:45 in this case) after the allow ACL is deleted; later, the same sample keeps arriving at larger intervals:

      2024/10/02 08:29:45.972600 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:45.972603 decoding failed: find sample failed: object not found
      2024/10/02 08:29:45.972604 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:46.380503 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:46.380516 decoding failed: find sample failed: object not found
      2024/10/02 08:29:46.380518 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:46.380543 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:46.380548 decoding failed: find sample failed: object not found
      2024/10/02 08:29:46.380550 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:46.380568 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:46.380572 decoding failed: find sample failed: object not found
      2024/10/02 08:29:46.380574 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:47.236519 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:47.236539 decoding failed: find sample failed: object not found
      2024/10/02 08:29:47.236541 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:47.236575 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:47.236579 decoding failed: find sample failed: object not found
      2024/10/02 08:29:47.236580 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:47.236604 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:47.236608 decoding failed: find sample failed: object not found
      2024/10/02 08:29:47.236609 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:48.900507 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:48.900521 decoding failed: find sample failed: object not found
      2024/10/02 08:29:48.900523 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:48.900551 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:48.900555 decoding failed: find sample failed: object not found
      2024/10/02 08:29:48.900557 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:48.900582 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:48.900589 decoding failed: find sample failed: object not found
      2024/10/02 08:29:48.900593 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:52.164502 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:52.164518 decoding failed: find sample failed: object not found
      2024/10/02 08:29:52.164519 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:52.164547 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:52.164552 decoding failed: find sample failed: object not found
      2024/10/02 08:29:52.164555 src=10.131.0.92, dst=10.131.0.93

      2024/10/02 08:29:52.164576 group_id=10, obs_domain=50331650, obs_point=3508384340
      2024/10/02 08:29:52.164582 decoding failed: find sample failed: object not found
      2024/10/02 08:29:52.164583 src=10.131.0.92, dst=10.131.0.93
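
The gaps between the bursts above grow roughly geometrically. A small sketch computing the inter-burst intervals from the first timestamp of each burst (timestamps copied from the log excerpt; the interpretation as TCP retransmission backoff on the now-dropped connection is an assumption):

```python
from datetime import datetime

# First log line of each burst, copied from the excerpt above.
bursts = [
    "2024/10/02 08:29:45.972600",
    "2024/10/02 08:29:46.380503",
    "2024/10/02 08:29:47.236519",
    "2024/10/02 08:29:48.900507",
    "2024/10/02 08:29:52.164502",
]
times = [datetime.strptime(t, "%Y/%m/%d %H:%M:%S.%f") for t in bursts]
# Seconds between consecutive bursts.
deltas = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
print(deltas)  # roughly 0.41, 0.86, 1.66, 3.26 seconds
```

The roughly doubling intervals suggest the stale samples track retransmissions of the long-lived connection's packets, each of which still hits the leftover allow sample action even though the packet is ultimately dropped.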

       

       

              OVN Team (ovnteam@redhat.com), NST Bot (nstbot), OVN QE