LOG-2093: EO Self-generated certificates issue with Kibana when "logging.openshift.io/elasticsearch-cert-management: true" annotation is used

    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Logging 5.4.0
    • Logging 5.4.0
    • Log Storage
    • None
    • False
    • False
    • NEW
    • VERIFIED
    • Logging (LogExp) - Sprint 212

      Description

      Kibana can't establish a connection to Elasticsearch; it logs the error:

      {"type":"log","@timestamp":"2021-12-21T15:24:17Z","tags":["warning","elasticsearch","admin"],"pid":116,"message":"Unable to revive connection: https://elasticsearch.openshift-logging.svc:9200/"}
      {"type":"log","@timestamp":"2021-12-21T15:24:17Z","tags":["warning","elasticsearch","admin"],"pid":116,"message":"No living connections"}
      

      At the same time, the Elasticsearch proxy container logs the error:

      2021/12/21 15:33:51 http: TLS handshake error from 10.131.0.119:44492: tls: failed to verify client certificate: x509: certificate signed by unknown authority (possibly because of "crypto/rsa: verification error" while trying to verify candidate authority certificate "Logging Signing CA")
      

      In the Elasticsearch operator pod logs:

      {"_ts":"2021-12-21T14:24:30.262503062Z","_level":"0","_component":"elasticsearch-operator_controller_kibana-controller","_message":"Reconciler error","_error":{"msg":"did not receive hashvalue for trusted CA value"},"name":"kibana","namespace":"openshift-logging"}
      

      The Kibana route is also unavailable.

      How to reproduce

      Deploy CLO from PR: pull/1265

      Workaround:

      oc delete deployment/kibana secret/kibana secret/kibana-proxy -n openshift-logging 
      

      After the deployment and secrets are recreated, the Kibana pod can connect to Elasticsearch and the route works correctly.

      Possible issue

      The problem may be a race condition in certificate generation: the Kibana and Elasticsearch certificates appear to be signed by different CAs (the signing-elasticsearch secret is updated several times).


            Anping Li added a comment -

            Verified on cluster-logging.5.4.0-40, elasticsearch-operator.5.4.0-53. The Elasticsearch and Kibana features work well.

            Igor Karpukhin (Inactive) added a comment -

            Dear Sender,

            Thank you for contacting Red Hat. Your message was not delivered to the original recipient. Please direct any future correspondence to Jonathan Suber - jsuber@redhat.com

            Best regards.

            Jeffrey Cantrill added a comment - - edited

            spad09  gvanloo  we should backport this to 5.3 to assist upgrade scenarios where CLO is upgraded before EO.  Cloning for 5.3


            Igor Karpukhin (Inactive) added a comment - https://github.com/openshift/elasticsearch-operator/pull/822

            Igor Karpukhin (Inactive) added a comment -

            vparfono and I found the problem. There is a race condition between the GenerateComponentsCerts and GenerateKibanaCerts methods. Solved by adding a mutex for both these functions.
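            The fix described above serializes the two certificate-generation paths with a shared mutex, so both cert sets end up signed by the same signing CA. A minimal Go sketch of the idea; all type and function names here are illustrative stand-ins, not the operator's actual code:

            ```go
            package main

            import (
            	"fmt"
            	"sync"
            )

            // certStore simulates the signing-elasticsearch secret. Without a
            // lock, two concurrent generators can each re-create the signing CA,
            // leaving Elasticsearch and Kibana certs signed by different CAs.
            type certStore struct {
            	mu sync.Mutex // serializes all cert generation (the LOG-2093 fix)
            	ca string     // current "Logging Signing CA"
            	es string     // CA that signed the Elasticsearch certs
            	kb string     // CA that signed the Kibana certs
            }

            // ensureCA mints the signing CA if missing; must be called with mu held.
            func (s *certStore) ensureCA(id int) string {
            	if s.ca == "" {
            		s.ca = fmt.Sprintf("ca-%d", id)
            	}
            	return s.ca
            }

            // Stand-ins for GenerateComponentsCerts and GenerateKibanaCerts:
            // both take the same mutex before touching the shared CA.
            func (s *certStore) generateComponentCerts(id int) {
            	s.mu.Lock()
            	defer s.mu.Unlock()
            	s.es = s.ensureCA(id)
            }

            func (s *certStore) generateKibanaCerts(id int) {
            	s.mu.Lock()
            	defer s.mu.Unlock()
            	s.kb = s.ensureCA(id)
            }

            func main() {
            	s := &certStore{}
            	var wg sync.WaitGroup
            	wg.Add(2)
            	go func() { defer wg.Done(); s.generateComponentCerts(1) }()
            	go func() { defer wg.Done(); s.generateKibanaCerts(2) }()
            	wg.Wait()
            	// With the shared mutex, both cert sets share one signing CA,
            	// so the Kibana-to-ES-proxy TLS handshake can succeed.
            	fmt.Println(s.es == s.kb)
            }
            ```

            With the lock in place the check always prints true; dropping the mutex reintroduces the window where each goroutine mints its own CA, which matches the "certificate signed by unknown authority" handshake failure in the description.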

              Assignee: ikarpukh Igor Karpukhin (Inactive)
              Reporter: vparfono Vitalii Parfonov
              Votes: 0
              Watchers: 4