OpenShift Logging / LOG-1200

Forwarding logs to Kafka using Chained certificates fails with error "state=error: certificate verify failed (unable to get local issuer certificate)"


Details

    • NEW
    • VERIFIED
    • Logging (Core) - Sprint 198, Logging (Core) - Sprint 199, Logging (Core) - Sprint 200

    Description

      https://bugzilla.redhat.com/show_bug.cgi?id=1904380

      [Description of problem]
      Forwarding logs to Kafka using TLS with a chained certificate (root CA + intermediate certificate) fails; the fluentd pods show the following error:

      ~~~
      2020-11-27 12:26:36 +0000 [warn]: suppressed same stacktrace
      2020-11-27 12:26:36 +0000 [warn]: Send exception occurred: SSL_connect returned=1 errno=0 state=error: certificate verify failed (unable to get local issuer certificate)
      2020-11-27 12:26:36 +0000 [warn]: Exception Backtrace : /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/ssl_socket_with_timeout.rb:69:in `connect_nonblock'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/ssl_socket_with_timeout.rb:69:in `initialize'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/connection.rb:130:in `new'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/connection.rb:130:in `open'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/connection.rb:101:in `block in send_request'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/instrumenter.rb:23:in `instrument'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/connection.rb:100:in `send_request'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/broker.rb:200:in `send_request'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/broker.rb:44:in `fetch_metadata'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/cluster.rb:427:in `block in fetch_cluster_info'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/cluster.rb:422:in `each'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/cluster.rb:422:in `fetch_cluster_info'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/cluster.rb:402:in `cluster_info'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/cluster.rb:102:in `refresh_metadata!'
      /usr/local/share/gems/gems/ruby-kafka-1.1.0/lib/kafka/cluster.rb:56:in `add_target_topics'
      /usr/local/share/gems/gems/fluent-plugin-kafka-0.13.1/lib/fluent/plugin/kafka_producer_ext.rb:91:in `initialize'
      /usr/local/share/gems/gems/fluent-plugin-kafka-0.13.1/lib/fluent/plugin/kafka_producer_ext.rb:60:in `new'
      /usr/local/share/gems/gems/fluent-plugin-kafka-0.13.1/lib/fluent/plugin/kafka_producer_ext.rb:60:in `topic_producer'
      /usr/local/share/gems/gems/fluent-plugin-kafka-0.13.1/lib/fluent/plugin/out_kafka2.rb:232:in `write'
      /usr/local/share/gems/gems/fluentd-1.7.4/lib/fluent/plugin/output.rb:1125:in `try_flush'
      /usr/local/share/gems/gems/fluentd-1.7.4/lib/fluent/plugin/output.rb:1431:in `flush_thread_run'
      /usr/local/share/gems/gems/fluentd-1.7.4/lib/fluent/plugin/output.rb:461:in `block (2 levels) in start'
      /usr/local/share/gems/gems/fluentd-1.7.4/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
      2020-11-27 12:26:36 +0000 [warn]: failed to flush the buffer. retry_time=6 next_retry_seconds=2020-11-27 12:27:11 +0000 chunk="5b5020a1f38cef2ace97184cd6d81ae9" error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed (unable to get local issuer certificate)"
      ~~~

      [Version-Release number of selected component (if applicable):]

      • OCP 4.6
      • Configure Logging to send logs to Kafka using TLS and chained certificates (Root CA + Intermediate CA)

      [How reproducible]
      Always in the customer environment

      Steps to Reproduce:
      1. Deploy 4.6
      2. Configure Logging to send logs to kafka brokers using TLS and using chained certificates (Root CA + Intermediate CA)
      3. Verify with curl and openssl that the CA + cert + key validate successfully, confirming that the CA file and the certificates themselves are correct
      4. Check fluentd logs to see the error in the description part

      A bug appears to exist where the read_ssl_file function in fluent-plugin-kafka does not support chained certificates. The upstream issue describes it as follows:

      "This is not mentioned explicitly in README.md, but read_ssl_file function doesn't support chained certificates. In *NIX environment this is very common that *.pem file contains multiple certificates concatenated together (eg. root CA + intermediate CA)."

      This is exactly how the ca-bundle handed to fluentd is configured: the ca-bundle contains the chained certificate (root CA + intermediate CA).
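
      The failure mode described above can be reproduced in a few lines of Ruby. This is a minimal sketch, not the plugin's actual code: the throwaway self-signed certificates stand in for a real root CA + intermediate CA bundle, and the regex split at the end illustrates one chain-aware way of reading a bundle. The key point is that `OpenSSL::X509::Certificate.new` on a concatenated PEM file silently returns only the first certificate, so the intermediate CA never reaches the trust store.

      ~~~
      require 'openssl'

      # Build a throwaway self-signed certificate (illustrative only).
      def make_cert(cn)
        key = OpenSSL::PKey::RSA.new(2048)
        cert = OpenSSL::X509::Certificate.new
        cert.version = 2
        cert.serial = 1
        cert.subject = cert.issuer = OpenSSL::X509::Name.parse("/CN=#{cn}")
        cert.public_key = key.public_key
        cert.not_before = Time.now
        cert.not_after = Time.now + 3600
        cert.sign(key, OpenSSL::Digest::SHA256.new)
        cert
      end

      # A *NIX-style ca-bundle: two PEM certificates concatenated together.
      bundle = make_cert('Root CA').to_pem + make_cert('Intermediate CA').to_pem

      # Parsing the whole bundle as one certificate keeps only the FIRST
      # entry; everything after it is silently dropped.
      single = OpenSSL::X509::Certificate.new(bundle)
      puts single.subject # only the Root CA; the intermediate is lost

      # A chain-aware reading: split on PEM boundaries and parse each
      # certificate individually, so both CAs are available.
      pems = bundle.scan(/-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----/m)
      certs = pems.map { |pem| OpenSSL::X509::Certificate.new(pem) }
      puts certs.map { |c| c.subject.to_s }
      ~~~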

      [Actual results]
      Fluentd is not able to send to Kafka using TLS and chained certificates

      [Expected results]
      Fluentd is able to send to Kafka using TLS and chained certificates

    People

      syedriko_sub@redhat.com Sergey Yedrikov
      Kabir Bharti