OpenShift Logging / LOG-2352

loki-operator controller pod in CrashLoopBackOff status


    • Type: Bug
    • Resolution: Not a Bug
    • Priority: Major
    • Affects Version/s: Logging 5.4.0
    • Fix Version/s: Logging 5.4.0
    • Component/s: Log Storage
    • NEW
    • Epic: OBSDA-7 - Adopting Loki as an alternative to Elasticsearch to support more lightweight, easier to manage/operate storage scenarios
    • VERIFIED
    • Sprint: Logging (LogExp) - Sprint 215, Logging (LogExp) - Sprint 216

      Description:
      The loki-operator controller pod is in CrashLoopBackOff in the openshift-operators-redhat namespace:

      E0311 09:51:03.052231       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
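
      The failing LIST can be reproduced directly against the API server. As a
      sketch (run with cluster-admin rights, impersonating the ServiceAccount
      named in the error):

      # Does the default SA have cluster-scope list on roles? Expect "no".
      oc auth can-i list roles.rbac.authorization.k8s.io --all-namespaces \
          --as=system:serviceaccount:openshift-operators-redhat:default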

      Version:
      quay.io/openshift-logging/loki-operator:v0.0.1
      git@github.com:openshift/loki.git, branch release-5.4, operator/bundle.Dockerfile

      Steps to reproduce:
      1) Deploy loki-operator into openshift-operators-redhat.
      2) Check pod status (a command sketch follows).
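
      The containerStatuses fragment below can be captured with something like the
      following; the pod name is specific to this cluster and will differ elsewhere:

      # List the operator pods, then dump the failing pod's container statuses.
      oc get pods -n openshift-operators-redhat
      oc get pod loki-operator-controller-manager-746c67d668-kr6gx \
          -n openshift-operators-redhat -o jsonpath='{.status.containerStatuses}'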
       

             "containerStatuses": [
                  {
                      "containerID": "cri-o://080482f2f69fc84d6d002ece8f200d1120da1058b041a45eeef282524900d09c",
                      "image": "quay.io/openshift/origin-kube-rbac-proxy:latest",
                      "imageID": "quay.io/openshift/origin-kube-rbac-proxy@sha256:5b01b4dccbca6d9f3526d861b92cb64885a3bd748a508bd1228ec10170a4485c",
                      "lastState": {},
                      "name": "kube-rbac-proxy",
                      "ready": true,
                      "restartCount": 0,
                      "started": true,
                      "state": {
                          "running": {
                              "startedAt": "2022-03-11T07:24:09Z"
                          }
                      }
                  },
                  {
                      "containerID": "cri-o://5f8470fdf43463f22483a96a6268e001f66e524baadc1165628c974bda77ee60",
                      "image": "quay.io/openshift-logging/loki-operator:v0.0.1",
                      "imageID": "quay.io/openshift-logging/loki-operator@sha256:d57be0fd3a2881d7781203e4cba8541de1bc24768a34cda97911fc619dc17214",
                      "lastState": {
                          "terminated": {
                              "containerID": "cri-o://5f8470fdf43463f22483a96a6268e001f66e524baadc1165628c974bda77ee60",
                              "exitCode": 1,
                              "finishedAt": "2022-03-11T10:00:09Z",
                              "reason": "Error",
                              "startedAt": "2022-03-11T09:58:06Z"
                          }
                      },
                      "name": "manager",
                      "ready": false,
                      "restartCount": 25,
                      "started": false,
                      "state": {
                          "waiting": {
                              "message": "back-off 5m0s restarting failed container=manager pod=loki-operator-controller-manager-746c67d668-kr6gx_openshift-operators-redhat(9e5a1474-c88d-4af4-8236-752c922e2917)",
                              "reason": "CrashLoopBackOff"
                          }
                      }
                  }
              ],

      3) oc logs loki-operator-controller-manager-746c67d668-kr6gx -c kube-rbac-proxy

      I0311 07:24:09.631213       1 main.go:181] Valid token audiences: 
      I0311 07:24:09.631390       1 main.go:305] Reading certificate files
      I0311 07:24:09.631588       1 main.go:339] Starting TCP socket on 0.0.0.0:8443
      I0311 07:24:09.631874       1 main.go:346] Listening securely on 0.0.0.0:8443
      2022/03/11 07:24:30 http: TLS handshake error from 10.128.2.18:36024: remote error: tls: bad certificate
      2022/03/11 07:24:50 http: TLS handshake error from 10.131.0.12:48176: remote error: tls: bad certificate
      2022/03/11 07:25:00 http: TLS handshake error from 10.128.2.18:36434: remote error: tls: bad certificate
      2022/03/11 07:25:20 http: TLS handshake error from 10.131.0.12:48594: remote error: tls: bad certificate
      2022/03/11 07:25:30 http: TLS handshake error from 10.128.2.18:36876: remote error: tls: bad certificate
      2022/03/11 07:25:50 http: TLS handshake error from 10.131.0.12:49002: remote error: tls: bad certificate
      2022/03/11 07:26:00 http: TLS handshake error from 10.128.2.18:37272: remote error: tls: bad certificate
      2022/03/11 07:26:20 http: TLS handshake error from 10.131.0.12:49416: remote error: tls: bad certificate
      2022/03/11 07:26:30 http: TLS handshake error from 10.128.2.18:37708: remote error: tls: bad certificate
      2022/03/11 07:26:50 http: TLS handshake error from 10.131.0.12:49830: remote error: tls: bad certificate
      2022/03/11 07:27:00 http: TLS handshake error from 10.128.2.18:38114: remote error: tls: bad certificate
      2022/03/11 07:27:20 http: TLS handshake error from 10.131.0.12:50248: remote error: tls: bad certificate
      2022/03/11 07:27:30 http: TLS handshake error from 10.128.2.18:38552: remote error: tls: bad certificate
      2022/03/11 07:27:50 http: TLS handshake error from 10.131.0.12:50662: remote error: tls: bad certificate
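
      The handshake failures above come from clients whose certificates the proxy
      does not trust; the source addresses are pod IPs, so they can be mapped back
      to pods (most likely Prometheus scrapers of the :8443 metrics endpoint) with
      a sketch like:

      # Find which pods own the client IPs seen in the TLS errors.
      oc get pods -A -o wide | grep -E '10\.128\.2\.18|10\.131\.0\.12'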

      4) oc logs loki-operator-controller-manager-746c67d668-kr6gx -c manager

      I0311 09:51:00.549344       1 request.go:665] Waited for 1.037361244s due to client-side throttling, not priority and fairness, request: GET:https://172.30.0.1:443/apis/packages.operators.coreos.com/v1?timeout=32s
      {"_ts":"2022-03-11T09:51:02.106628355Z","_level":"0","_component":"loki-operator_controller-runtime_metrics","_message":"metrics server is starting to listen","addr":":8080"}
      {"_ts":"2022-03-11T09:51:02.107071957Z","_level":"0","_component":"loki-operator","_message":"registering metrics"}
      {"_ts":"2022-03-11T09:51:02.107140158Z","_level":"0","_component":"loki-operator","_message":"Registering profiling endpoints."}
      {"_ts":"2022-03-11T09:51:02.107168258Z","_level":"0","_component":"loki-operator","_message":"starting manager"}
      {"_ts":"2022-03-11T09:51:02.10745486Z","_level":"0","_component":"loki-operator_controller-runtime_manager","_message":"starting metrics server","path":"/metrics"}
      {"_ts":"2022-03-11T09:51:02.107592361Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"spec":{"size":"","storage":{"secret":{"type":"","name":""}},"storageClassName":"","replicationFactor":0},"status":{"components":{}},"metadata":{"creationTimestamp":null}}}}
      {"_ts":"2022-03-11T09:51:02.108146464Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null}}}}
      {"_ts":"2022-03-11T09:51:02.108253565Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null}}}}
      {"_ts":"2022-03-11T09:51:02.108372166Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"spec":{},"status":{"loadBalancer":{}}}}}
      {"_ts":"2022-03-11T09:51:02.108611167Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"spec":{"selector":null,"template":{"metadata":{"creationTimestamp":null},"spec":{"containers":null}},"strategy":{}},"status":{}}}}
      {"_ts":"2022-03-11T09:51:02.110236877Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"spec":{"selector":null,"template":{"metadata":{"creationTimestamp":null},"spec":{"containers":null}},"serviceName":"","updateStrategy":{}},"status":{"replicas":0}}}}
      {"_ts":"2022-03-11T09:51:02.110447579Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"rules":null}}}
      {"_ts":"2022-03-11T09:51:02.110566179Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"roleRef":{"apiGroup":"","kind":"","name":""}}}}
      {"_ts":"2022-03-11T09:51:02.11066658Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"rules":null}}}
      {"_ts":"2022-03-11T09:51:02.11074308Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"roleRef":{"apiGroup":"","kind":"","name":""}}}}
      {"_ts":"2022-03-11T09:51:02.110813481Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting EventSource","source":{"Type":{"metadata":{"creationTimestamp":null},"spec":{"to":{"kind":"","name":"","weight":null}},"status":{}}}}
      {"_ts":"2022-03-11T09:51:02.111443985Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Starting Controller"}
      E0311 09:51:02.116164       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:02.118064       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:03.052231       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:03.241454       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:04.890864       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:05.256217       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:08.678920       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:10.847446       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:17.327105       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:18.526533       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:31.239200       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:51:32.554623       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:52:05.653742       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:52:21.839936       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.RoleBinding: failed to list *v1.RoleBinding: rolebindings.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "rolebindings" in API group "rbac.authorization.k8s.io" at the cluster scope
      E0311 09:52:55.672460       1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.22.1/tools/cache/reflector.go:167: Failed to watch *v1.Role: failed to list *v1.Role: roles.rbac.authorization.k8s.io is forbidden: User "system:serviceaccount:openshift-operators-redhat:default" cannot list resource "roles" in API group "rbac.authorization.k8s.io" at the cluster scope
      {"_ts":"2022-03-11T09:53:02.112551505Z","_level":"0","_component":"loki-operator_controller-runtime_manager_controller_lokistack","_message":"Could not wait for Cache to sync","_error":{"msg":"failed to wait for lokistack caches to sync: timed out waiting for cache to be synced"}}
      {"_ts":"2022-03-11T09:53:02.11347821Z","_level":"0","_component":"loki-operator","_message":"problem running manager","_error":{"msg":"failed to wait for lokistack caches to sync: timed out waiting for cache to be synced"}}

              Assignee: Sashank Agarwal (sasagarw@redhat.com) (Inactive)
              Reporter: Anping Li (rhn-support-anli)