OCPBUGS-19038

Kube-apiserver audits logs are frequently logging "Forbidden" events from AMQ operator related service accounts



      Description of problem:

I am opening this bug as a follow-up to the one I opened with the AMQ team: https://issues.redhat.com/browse/ENTMQST-4982
      
      Issue:
      
The AMQ Streams operator-related service accounts are frequently generating 403 Forbidden errors in the kube-apiserver audit logs. According to the AMQ team, these service accounts are not intended to perform the specific action, nor do they have the privilege to do so.
Essentially, the service accounts are attempting "get" operations on the AMQ-related pods.
      
      Please see the events below.
      
      ~~~
      SA monitor-kafka
      
      {"kind":"Event","apiVersion":"audit.k8s.io/v1","level":"Metadata","auditID":"89c7baa9-9334-4bba-9d8c-51500083840e","stage":"ResponseComplete","requestURI":"/api/v1/namespaces/amqstreams-prod-bck/pods/monitor-kafka-1","verb":"get","user":{"username":"system:serviceaccount:amqstreams-prod-bck:monitor-kafka","uid":"d76268fe-300b-4675-843a-4657505d98dc","groups":["system:serviceaccounts","system:serviceaccounts:amqstreams-prod-bck","system:authenticated"],"extra":{"authentication.kubernetes.io/pod-name":["monitor-kafka-1"],"authentication.kubernetes.io/pod-uid":["19d9e1f3-225c-4fe6-bba1-932548c9e2fe"]}},"sourceIPs":["10.149.192.51"],"objectRef":{"resource":"pods","namespace":"amqstreams-prod-bck","name":"monitor-kafka-1","apiVersion":"v1"},"responseStatus":{"metadata":{},"status":"Failure","reason":"Forbidden","code":403},"requestReceivedTimestamp":"2023-04-13T15:20:32.508580Z","stageTimestamp":"2023-04-13T15:20:32.509368Z","annotations":{"authorization.k8s.io/decision":"forbid","authorization.k8s.io/reason":""}}
      
      SA monitor-zookeeper
      
      {"kind":"Event","apiVersion":"audit.k8s.io/v1","level":"Metadata","auditID":"c9e632cc-8564-4cbf-99e8-a8b800f81ae1","stage":"ResponseComplete","requestURI":"/api/v1/namespaces/amqstreams-prod-bck/pods/monitor-zookeeper-0","verb":"get","user":{"username":"system:serviceaccount:amqstreams-prod-bck:monitor-zookeeper","uid":"958541d7-5462-4b62-a808-7145c3d5254d","groups":["system:serviceaccounts","system:serviceaccounts:amqstreams-prod-bck","system:authenticated"],"extra":{"authentication.kubernetes.io/pod-name":["monitor-zookeeper-0"],"authentication.kubernetes.io/pod-uid":["22c3c6a5-1982-48bd-80f7-c6cc977f7f33"]}},"sourceIPs":["10.149.192.51"],"objectRef":{"resource":"pods","namespace":"amqstreams-prod-bck","name":"monitor-zookeeper-0","apiVersion":"v1"},"responseStatus":{"metadata":{},"status":"Failure","reason":"Forbidden","code":403},"requestReceivedTimestamp":"2023-04-13T15:20:11.208659Z","stageTimestamp":"2023-04-13T15:20:11.210262Z","annotations":{"authorization.k8s.io/decision":"forbid","authorization.k8s.io/reason":""}}
      
      SA monitor-splunk-connect
      
      {"kind":"Event","apiVersion":"audit.k8s.io/v1","level":"Metadata","auditID":"4abed53b-0d4e-4de0-ab79-cd7794a8e377","stage":"ResponseComplete","requestURI":"/api/v1/namespaces/amqstreams-prod-bck/pods/monitor-splunk-connect-84b64b55df-w62kf","verb":"get","user":{"username":"system:serviceaccount:amqstreams-prod-bck:monitor-splunk-connect","uid":"f8a80323-22d8-4550-928a-3505da5bbcc4","groups":["system:serviceaccounts","system:serviceaccounts:amqstreams-prod-bck","system:authenticated"],"extra":{"authentication.kubernetes.io/pod-name":["monitor-splunk-connect-84b64b55df-w62kf"],"authentication.kubernetes.io/pod-uid":["257ad32a-9fae-48d6-980a-538ca0366a9c"]}},"sourceIPs":["10.149.192.34"],"objectRef":{"resource":"pods","namespace":"amqstreams-prod-bck","name":"monitor-splunk-connect-84b64b55df-w62kf","apiVersion":"v1"},"responseStatus":{"metadata":{},"status":"Failure","reason":"Forbidden","code":403},"requestReceivedTimestamp":"2023-04-13T15:17:26.127962Z","stageTimestamp":"2023-04-13T15:17:26.128894Z","annotations":{"authorization.k8s.io/decision":"forbid","authorization.k8s.io/reason":""}}
      
      SA monitor-entity-operator
      
      {"kind":"Event","apiVersion":"audit.k8s.io/v1","level":"Metadata","auditID":"2ce2c300-b855-42d9-89db-7e89c0ccaa39","stage":"ResponseComplete","requestURI":"/api/v1/namespaces/amqstreams-prod-bck/pods/monitor-entity-operator-5b95469495-bmf7m","verb":"get","user":{"username":"system:serviceaccount:amqstreams-prod-bck:monitor-entity-operator","uid":"7d2f348d-e6a4-40d7-9e74-d541b8e03b72","groups":["system:serviceaccounts","system:serviceaccounts:amqstreams-prod-bck","system:authenticated"],"extra":{"authentication.kubernetes.io/pod-name":["monitor-entity-operator-5b95469495-bmf7m"],"authentication.kubernetes.io/pod-uid":["49fba0b7-8724-4741-b98d-991f8c0ef04c"]}},"sourceIPs":["10.149.192.82"],"objectRef":{"resource":"pods","namespace":"amqstreams-prod-bck","name":"monitor-entity-operator-5b95469495-bmf7m","apiVersion":"v1"},"responseStatus":{"metadata":{},"status":"Failure","reason":"Forbidden","code":403},"requestReceivedTimestamp":"2023-04-13T15:15:08.394830Z","stageTimestamp":"2023-04-13T15:15:08.395840Z","annotations":{"authorization.k8s.io/decision":"forbid","authorization.k8s.io/reason":""}}
      ~~~
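
The Forbidden events quoted above can be tallied per service account to confirm which SAs are affected and how often. Below is a minimal sketch, assuming the audit log is in the usual JSON-lines format; the `count_forbidden` helper and the trimmed sample event are illustrative, not part of any shipped tooling.

```python
import json
from collections import Counter

def count_forbidden(lines):
    """Tally 403 Forbidden audit events per requesting username."""
    counts = Counter()
    for line in lines:
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        if event.get("responseStatus", {}).get("code") == 403:
            user = event.get("user", {}).get("username", "<unknown>")
            counts[user] += 1
    return counts

# Example with one of the events quoted above (fields trimmed for brevity):
sample = json.dumps({
    "kind": "Event",
    "user": {"username": "system:serviceaccount:amqstreams-prod-bck:monitor-kafka"},
    "responseStatus": {"status": "Failure", "reason": "Forbidden", "code": 403},
})
print(count_forbidden([sample]))
```

Running this over a full day's audit log would show whether the ~576 events split evenly across the four service accounts listed above.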
      
Below is the status of the issue; the customer is seeing the same behavior in two clusters.
      
~~~
- The issue status during the OCP upgrade

The "pre-prod" cluster: 4.10.32 --> 4.10.45 (issue resolved) --> 4.10.51 (issue came back)

The "prod" cluster: 4.10.32 --> 4.10.51 (no intermediate OCP version, so the issue has been present throughout)

- The issue status during the AMQ upgrade

The "pre-prod" cluster: AMQ from v2.2.0-4 to v2.2.1-5 -- issue gone (OCP version still 4.10.51)

The "prod" cluster: AMQ from v2.2.0-4 to v2.2.1-5 -- issue still present (OCP version still 4.10.51)
~~~
      
- Every day the customer sees around 576 Forbidden events, and they occur exactly every 15 minutes (per the customer).
      
I am opening this bug to get help with this issue, as the AMQ team cannot help much further here. The requirement is to identify why these SAs are performing this action when they are not intended to do so.
More details are available in https://issues.redhat.com/browse/ENTMQST-4982

      Version-Release number of selected component (if applicable):

       

      How reproducible:

It is not reproducible on demand.

      Steps to Reproduce:

      1.
      2.
      3.
      

      Actual results:

       

      Expected results:

       

      Additional info:

       

Assignee: Unassigned
Reporter: rhn-support-amuhamme (MUHAMMED ASLAM V K)
QA Contact: Ke Wang
Votes: 0
Watchers: 5