OpenShift Logging / LOG-3314

[fluentd] The passphrase can not be enabled when forwarding logs to Kafka

      Before this update, users encountered a limitation where they couldn't enable the passphrase for log forwarding to Kafka. This posed a security risk as sensitive information could be exposed.
      With this update, we have resolved the issue and now users can easily enable the passphrase for log forwarding to Kafka. This enhancement ensures secure transmission of log data, protecting it from unauthorized access.

    • Bug Fix
    • Log Collection - Sprint 235, Log Collection - Sprint 236, Log Collection - Sprint 237

      Description of problem:

The passphrase is not enabled in fluentd.conf when a passphrase is provided in the forward secret.

Note: there is already a known upstream issue: https://github.com/fluent/fluent-plugin-kafka/issues/382

      Version-Release number of selected component (if applicable):

      Logging 5.x

      How reproducible:

      always

      Steps to Reproduce:

      1. Deploy Kafka with ssl.client.auth=required
        git clone git@gitlab.cee.redhat.com:anli/aosqe-tools.git
        cd logging/log_template/kafka/kafka-2.4.1/
        sh 01_create-pki-cluster-client_passphase.sh
        sh 10_deploy-kafka-plaintext-sasl_ssl.sh
      2. Use a certificate with a passphrase to forward logs to Kafka
        sh 20_create-clf-kafka-mutual_sasl_ssl_passphase.sh
        #oc create secret generic kafka-fluentd --from-file=ca-bundle.crt=ca/ca_bundle.crt --from-file=tls.crt=client/client.crt --from-file=tls.key=client/client.key --from-literal=username=${kafka_user_name} --from-literal=password=${kafka_user_password} --from-literal=sasl_over_ssl=true --from-literal=sasl.enable=true --from-literal=sasl.mechanisms=PLAIN --from-literal=passphrase=aosqe2021 -n openshift-logging

      Actual results:

      #fluent.conf
      <label @KAFKA_APP>
        <match **>
          @type kafka2
          @id kafka_app
          brokers kafka.openshift-logging.svc.cluster.local:9093
          default_topic clo-topic
          use_event_time true
          username "#\{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/username') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/username','r') do |f|f.read end : ''}"
          password "#\{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/password') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/password','r') do |f|f.read end : ''}"
          ssl_client_cert_key '/var/run/ocp-collector/secrets/kafka-fluentd/tls.key'
          ssl_client_cert '/var/run/ocp-collector/secrets/kafka-fluentd/tls.crt'
          ssl_ca_cert '/var/run/ocp-collector/secrets/kafka-fluentd/ca-bundle.crt'
          sasl_over_ssl true
          <format>
            @type json
           .....
      </label>
      

      Expected results:

      #fluent.conf
      <label @KAFKA_APP>
        <match **>
          @type kafka2
         .....
          ssl_client_cert_key '/var/run/ocp-collector/secrets/kafka-fluentd/tls.key'
          ssl_client_cert_key_password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase','r') do |f|f.read end : ''}"
      
             .....
      </label>
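
      (Illustrative aside: the quoted "#{...}" value is embedded Ruby that fluentd evaluates when it loads the config. A minimal sketch of what it does, using the secret path from this reproducer:)

        # Ruby equivalent of the embedded "#{...}" expression above: read the
        # passphrase from the mounted secret file if it exists, else use "".
        path = "/var/run/ocp-collector/secrets/kafka-fluentd/passphrase"
        passphrase = File.exist?(path) ? File.open(path, "r") { |f| f.read } : ""
        # (The generated config uses the older File.exists? alias and Kernel#open.)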
      

      Additional info:

            [LOG-3314] [fluentd] The passphrase can not be enabled when forwarding logs to Kafka

            Jeffrey Cantrill added a comment -

            This issue requires Release Notes Text. Please modify the Release Note Text or set the Release Note Type to "None".

            Errata Tool added a comment -

            Since the problem described in this issue should be resolved in a recent advisory, it has been closed.

            For information on the advisory (Moderate: Logging Subsystem 5.7.2 - Red Hat OpenShift security update), and where to find the updated files, follow the link below.

            If the solution does not work for you, open a new bug report.
            https://access.redhat.com/errata/RHSA-2023:3495


            Vitalii Parfonov added a comment -

            anli@redhat.com do you have any update about this issue?

            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.7-rhel-8_upstream_feef2536018551260fe1e311501d4ff7: Updated US source to: 22bd590 Merge pull request #2034 from vparfonov/release-5.7-LOG-3314

            Vitalii Parfonov added a comment -

            anli@redhat.com We didn't add anything related to scram_mechanism by default, but it's added in your test case: --from-literal=sasl.mechanisms=PLAIN

            Anping Li added a comment - edited

            Why is scram_mechanism "PLAIN" enabled by default? If scram_mechanism isn't set, fluentd sends logs to Kafka using the PLAIN mechanism without problems.

            Vitalii Parfonov added a comment - edited

            Thanks anli@redhat.com, the line-break issue is fixed in https://github.com/openshift/cluster-logging-operator/pull/2034.
            Regarding PLAIN not being supported: this needs checking.
            UPD:
            This is a limitation of ruby-kafka: https://github.com/zendesk/ruby-kafka/blob/v1.5.0/lib/kafka/sasl/scram.rb#L9.

            MECHANISMS = {
                    "sha256" => "SCRAM-SHA-256",
                    "sha512" => "SCRAM-SHA-512",
                  }
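
            (Illustrative aside: a minimal, self-contained Ruby sketch, paraphrasing the ruby-kafka v1.5.0 lookup rather than quoting it, of why any mechanism outside this hash, including "PLAIN", is rejected:)

              # Anything other than "sha256"/"sha512" raises, which produces the
              # "SCRAM mechanism PLAIN is not supported." error in the collector logs.
              MECHANISMS = {
                "sha256" => "SCRAM-SHA-256",
                "sha512" => "SCRAM-SHA-512",
              }.freeze

              def resolve_scram_mechanism(name)
                MECHANISMS.fetch(name) do
                  raise ArgumentError, "SCRAM mechanism #{name} is not supported."
                end
              end

              resolve_scram_mechanism("sha512")   # => "SCRAM-SHA-512"
              # resolve_scram_mechanism("PLAIN")  # would raise ArgumentError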
            


            Anping Li added a comment -

            After I break the line, the collector pod raises "SCRAM mechanism PLAIN is not supported", as shown below.
            After I remove scram_mechanism "PLAIN", fluentd works fine.

            $oc logs collector-h5ng5
            Defaulted container "collector" out of: collector, logfilesmetricexporter
            POD_IPS: 10.131.0.80, PROM_BIND_IP: 0.0.0.0
            Setting each total_size_limit for 1 buffers to 20525125632 bytes
            Setting queued_chunks_limit_size for each buffer to 2446
            Setting chunk_limit_size for each buffer to 8388608
            /var/lib/fluentd/pos/journal_pos.json exists, checking if yajl parser able to parse this json file without any error.
            ruby 2.7.6p219 (2022-04-12 revision c9c2245c0a) [x86_64-linux]
            RUBY_GC_HEAP_OLDOBJECT_LIMIT_FACTOR=0.900000 (default value: 2.000000)
            checking if /var/lib/fluentd/pos/journal_pos.json a valid json by calling yajl parser
            2023-06-01 06:08:40 +0000 [warn]: '@' is the system reserved prefix. It works in the nested configuration for now but it will be rejected: @timestamp
            2023-06-01 06:08:40 +0000 [warn]: '@' is the system reserved prefix. It works in the nested configuration for now but it will be rejected: @timestamp
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:8: warning: already initialized constant TRANSPORT_CLASS
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:3: warning: previous definition of TRANSPORT_CLASS was here
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:25: warning: already initialized constant SELECTOR_CLASS
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:20: warning: previous definition of SELECTOR_CLASS was here
            2023-06-01 06:08:42 +0000 [error]: unexpected error error_class=Kafka::SaslScramError error="SCRAM mechanism PLAIN is not supported."
              2023-06-01 06:08:42 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/sasl/scram.rb:22:in `block in initialize'
            
            


            Anping Li added a comment - edited

            ssl_client_cert_key_password and scram_mechanism are written on one line.

              <match **>
                @type kafka2
                @id kafka_app
                brokers kafka.openshift-logging.svc.cluster.local:9093
                default_topic clo-topic
                use_event_time true
                username "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/username') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/username','r') do |f|f.read end : ''}"
                password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/password') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/password','r') do |f|f.read end : ''}"
                ssl_client_cert_key '/var/run/ocp-collector/secrets/kafka-fluentd/tls.key'
                ssl_client_cert '/var/run/ocp-collector/secrets/kafka-fluentd/tls.crt'
                ssl_ca_cert '/var/run/ocp-collector/secrets/kafka-fluentd/ca-bundle.crt'
                sasl_over_ssl true
                ssl_client_cert_key_password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase','r') do |f|f.read end : ''}"scram_mechanism "PLAIN"
                  ...
                 </match>
            
             oc logs collector-6bgq4
            Defaulted container "collector" out of: collector, logfilesmetricexporter
            POD_IPS: 10.128.2.78, PROM_BIND_IP: 0.0.0.0
            Setting each total_size_limit for 1 buffers to 20525125632 bytes
            Setting queued_chunks_limit_size for each buffer to 2446
            Setting chunk_limit_size for each buffer to 8388608
            /var/lib/fluentd/pos/journal_pos.json exists, checking if yajl parser able to parse this json file without any error.
            ruby 2.7.6p219 (2022-04-12 revision c9c2245c0a) [x86_64-linux]
            RUBY_GC_HEAP_OLDOBJECT_LIMIT_FACTOR=0.900000 (default value: 2.000000)
            checking if /var/lib/fluentd/pos/journal_pos.json a valid json by calling yajl parser
            /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/basic_parser.rb:92:in `parse_error!': expected end of line at fluent.conf line 347,201 (Fluent::ConfigParseError)
            346:     sasl_over_ssl true
            347:     ssl_client_cert_key_password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase','r') do |f|f.read end : ''}"scram_mechanism "PLAIN"
            
                 ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------^
            348:     <format>
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:133:in `parse_element'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:96:in `parse_element'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:96:in `parse_element'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:44:in `parse!'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:33:in `parse'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config.rb:58:in `parse'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config.rb:39:in `build'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:618:in `initialize'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/command/fluentd.rb:362:in `new'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/command/fluentd.rb:362:in `<top (required)>'
            	from /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
            	from /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
            	from /usr/local/share/gems/gems/fluentd-1.14.6/bin/fluentd:15:in `<top (required)>'
            	from /usr/local/bin/fluentd:23:in `load'
            	from /usr/local/bin/fluentd:23:in `<main>'
            [anli@preserve-docker-slave kafka-2.4.1]$ 
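
            (Illustrative aside: a hypothetical Ruby sketch, not operator code, of the failure mode behind line 347: a template fragment emitted without its trailing newline glues the next directive onto the same fluent.conf line.)

              password_line  = %q(ssl_client_cert_key_password "...")  # fragment missing its trailing "\n"
              next_directive = %q(scram_mechanism "PLAIN")

              broken = password_line + next_directive         # one line  -> Fluent::ConfigParseError
              fixed  = password_line + "\n" + next_directive  # two lines -> parses cleanly
              puts broken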
            
            


            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.7-rhel-8_upstream_5992592a94f0861ef6c7dae08b09b5d5: Updated 2 upstream sources

            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.7-rhel-8_upstream_de608460c881c6db58116edd7ae02c06: Updated US source to: d2cb7b0 Merge pull request #2018 from vparfonov/release-5.7-log3314

            Anping Li added a comment -

            vparfono It seems the PR https://github.com/openshift/cluster-logging-operator/pull/2007 isn't cherry-picked to release-5.7.


            Anping Li added a comment - edited

            fluent-plugin-kafka is now at 0.19.0, but the collector pods still raise the error below. The passphrase isn't written into fluent.conf.

              <match **>
                @type kafka2
                @id kafka_app
                brokers kafka.kafka-aosqe.svc.cluster.local:9093
                default_topic clo-topic
                use_event_time true
                username "#{File.exists?('/var/run/ocp-collector/secrets/to-kafka-secret/username') ? open('/var/run/ocp-collector/secrets/to-kafka-secret/username','r') do |f|f.read end : ''}"
                password "#{File.exists?('/var/run/ocp-collector/secrets/to-kafka-secret/password') ? open('/var/run/ocp-collector/secrets/to-kafka-secret/password','r') do |f|f.read end : ''}"
                ssl_client_cert_key '/var/run/ocp-collector/secrets/to-kafka-secret/tls.key'
                ssl_client_cert '/var/run/ocp-collector/secrets/to-kafka-secret/tls.crt'
                ssl_ca_cert '/var/run/ocp-collector/secrets/to-kafka-secret/ca-bundle.crt'
                sasl_over_ssl true
                <format>
                  @type json
                </format>
             ....
             </match>
            
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:20: warning: previous definition of SELECTOR_CLASS was here
            Enter PEM pass phrase:
            2023-05-29 14:43:16 +0000 [error]: unexpected error error_class=OpenSSL::PKey::PKeyError error="Could not parse PKey: no start line"
              2023-05-29 14:43:16 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/ssl_context.rb:18:in `read'
            
            


            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.7-rhel-8_upstream_0326a8d36ae26f5143e9d9d7860705ad: Updated 2 upstream sources

            Vitalii Parfonov added a comment -

            Thanks a lot rhn-support-ikanse

            Ishwar Kanse added a comment - edited

            Yeah, the Fluentd Kafka plugin in our Fluentd image is at 0.17.5.

            sh-4.4# gem list | grep -i kafka
            fluent-plugin-kafka (0.17.5)
            ruby-kafka (1.4.0)
            sh-4.4# find / -name fluent-plugin-kafka
            /usr/local/share/gems/cache/fluent-plugin-kafka-0.17.5.gem
            /usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5
            /usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/fluent-plugin-kafka.gemspec
            /usr/local/share/gems/specifications/fluent-plugin-kafka-0.17.5.gemspec

            That version doesn't have the ssl_client_cert_key_password parameter: https://github.com/fluent/fluent-plugin-kafka/blob/v0.17.5/lib/fluent/plugin/in_kafka_group.rb

            It's available in https://github.com/fluent/fluent-plugin-kafka/blob/v0.19.0/lib/fluent/plugin/in_kafka_group.rb#L189
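
            (Illustrative aside: a quick Ruby check, equivalent to the gem list above, for whether the installed plugin is new enough:)

              # ssl_client_cert_key_password only exists upstream from fluent-plugin-kafka 0.19.x.
              require "rubygems"
              spec = Gem::Specification.find_by_name("fluent-plugin-kafka")
              puts spec.version  # 0.17.5 in this image
              puts spec.version >= Gem::Version.new("0.19.0") ? "parameter supported" : "parameter missing"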


            Ishwar Kanse added a comment - edited

            Hi vparfono, with the new PR we still run into the issue with the ssl_client_cert_key_password parameter.

            $ oc logs collector-wfnrv
            Defaulted container "collector" out of: collector, logfilesmetricexporter
            POD_IPS: 10.130.0.148, PROM_BIND_IP: 0.0.0.0
            Setting each total_size_limit for 1 buffers to 19236595507 bytes
            Setting queued_chunks_limit_size for each buffer to 2293
            Setting chunk_limit_size for each buffer to 8388608
            2023-05-17 11:09:23 +0000 [warn]: '@' is the system reserved prefix. It works in the nested configuration for now but it will be rejected: @timestamp
            2023-05-17 11:09:23 +0000 [warn]: '@' is the system reserved prefix. It works in the nested configuration for now but it will be rejected: @timestamp
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:8: warning: already initialized constant TRANSPORT_CLASS
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:3: warning: previous definition of TRANSPORT_CLASS was here
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:25: warning: already initialized constant SELECTOR_CLASS
            /usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:20: warning: previous definition of SELECTOR_CLASS was here
            2023-05-17 11:09:24 +0000 [warn]: parameter 'ssl_client_cert_key_password' in <match **>
              @type kafka2
              @id kafka_app
              brokers kafka.openshift-logging.svc.cluster.local:9093
              default_topic "clo-topic"
              use_event_time true
              username "admin"
              password xxxxxx
              ssl_client_cert_key "/var/run/ocp-collector/secrets/kafka-fluentd/tls.key"
              ssl_client_cert "/var/run/ocp-collector/secrets/kafka-fluentd/tls.crt"
              ssl_ca_cert /var/run/ocp-collector/secrets/kafka-fluentd/ca-bundle.crt
              sasl_over_ssl true
              ssl_client_cert_key_password aosqe2021
              scram_mechanism "PLAIN"
              <format>
                @type "json"
              </format>
              <buffer _clo-topic>
                @type "file"
                path "/var/lib/fluentd/kafka_app"
                flush_mode interval
                flush_interval 1s
                flush_thread_count 2
                retry_type exponential_backoff
                retry_wait 1s
                retry_max_interval 60s
                retry_timeout 60m
                queued_chunks_limit_size 2293
                total_limit_size 19236595507
                chunk_limit_size 8388608
                overflow_action block
                disable_chunk_backup true
              </buffer>
            </match> is not used.
            Enter PEM pass phrase:
            2023-05-17 11:09:24 +0000 [error]: unexpected error error_class=OpenSSL::PKey::PKeyError error="Could not parse PKey: no start line"
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka/ssl_context.rb:18:in `read'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka/ssl_context.rb:18:in `build'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka/client.rb:95:in `initialize'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka.rb:366:in `new'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka.rb:366:in `new'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/out_kafka2.rb:105:in `refresh_client'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/out_kafka2.rb:196:in `start'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:203:in `block in start'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:182:in `block (2 levels) in lifecycle'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:121:in `block (2 levels) in lifecycle'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:120:in `each'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:120:in `block in lifecycle'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:113:in `each'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:113:in `lifecycle'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:181:in `block in lifecycle'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:178:in `each'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:178:in `lifecycle'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:202:in `start'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/engine.rb:248:in `start'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/engine.rb:147:in `run'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:720:in `block in run_worker'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:971:in `main_process'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:711:in `run_worker'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/command/fluentd.rb:376:in `<top (required)>'
              2023-05-17 11:09:24 +0000 [error]: /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
              2023-05-17 11:09:24 +0000 [error]: /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/bin/fluentd:15:in `<top (required)>'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/bin/fluentd:23:in `load'
              2023-05-17 11:09:24 +0000 [error]: /usr/local/bin/fluentd:23:in `<main>'
            2023-05-17 11:09:24 +0000 [error]: unexpected error error_class=OpenSSL::PKey::PKeyError error="Could not parse PKey: no start line"
              2023-05-17 11:09:24 +0000 [error]: suppressed same stacktrace

            Looks like the Fluentd Kafka plugin version in our Fluentd image doesn't support the ssl_client_cert_key_password parameter yet; the option is available upstream: https://github.com/fluent/fluent-plugin-kafka/blob/master/lib/fluent/plugin/in_kafka_group.rb#L195-L200
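
            (Illustrative aside: a minimal Ruby sketch, with the key path and passphrase assumed from this reproducer, of why an encrypted client key cannot be parsed without its passphrase, matching the "Enter PEM pass phrase:" prompt and PKey error above.)

              require "openssl"

              pem = File.read("client/client.key")        # assumed passphrase-protected PEM key
              key = OpenSSL::PKey.read(pem, "aosqe2021")  # parses once the passphrase is supplied
              # OpenSSL::PKey.read(pem) without the passphrase prompts on a TTY or raises
              # OpenSSL::PKey::PKeyError, as in the collector log above.
              puts key.class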


            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.8-rhel-9_upstream_e2e889fc9a30919fe7840d988f8b8b51: Updated US source to: d93c19b Merge pull request #2007 from vparfonov/log3314-2

            Vitalii Parfonov added a comment -

            rhn-support-ikanse Thanks for checking. The fix for this issue is in the master branch via PR https://github.com/openshift/cluster-logging-operator/pull/2007; branch release-5.7 is in code freeze.

            Ishwar Kanse added a comment -

            vparfono Built a CLO image using the PR: quay.io/rhn_support_ikanse/cluster-logging-operator:pr1963

            There is a config issue in the generated Fluentd conf. The collector pods are in CrashLoopBackOff.

            $ oc logs collector-z54pk
            Defaulted container "collector" out of: collector, logfilesmetricexporter
            POD_IPS: 10.130.0.137, PROM_BIND_IP: 0.0.0.0
            Setting each total_size_limit for 1 buffers to 20533579161 bytes
            Setting queued_chunks_limit_size for each buffer to 2447
            Setting chunk_limit_size for each buffer to 8388608
            /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/basic_parser.rb:92:in `parse_error!': expected end of line at fluent.conf line 347,201 (Fluent::ConfigParseError)
            346:     sasl_over_ssl true
            347:     ssl_client_cert_key_password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase','r') do |f|f.read end : ''}"scram_mechanism "PLAIN"
                 ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------^
            348:     <format>
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:133:in `parse_element'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:96:in `parse_element'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:96:in `parse_element'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:44:in `parse!'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config/v1_parser.rb:33:in `parse'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config.rb:58:in `parse'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/config.rb:39:in `build'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:618:in `initialize'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/command/fluentd.rb:362:in `new'
                from /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/command/fluentd.rb:362:in `<top (required)>'
                from /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
                from /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
                from /usr/local/share/gems/gems/fluentd-1.14.6/bin/fluentd:15:in `<top (required)>'
                from /usr/local/bin/fluentd:23:in `load'
                from /usr/local/bin/fluentd:23:in `<main>'
            

            The generated Fluentd config that produces the error:

              <match **>
                @type kafka2
                @id kafka_app
                brokers kafka.openshift-logging.svc.cluster.local:9093
                default_topic clo-topic
                use_event_time true
                username "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/username') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/username','r') do |f|f.read end : ''}"
                password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/password') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/password','r') do |f|f.read end : ''}"
                ssl_client_cert_key '/var/run/ocp-collector/secrets/kafka-fluentd/tls.key'
                ssl_client_cert '/var/run/ocp-collector/secrets/kafka-fluentd/tls.crt'
                ssl_ca_cert '/var/run/ocp-collector/secrets/kafka-fluentd/ca-bundle.crt'
                sasl_over_ssl true
                ssl_client_cert_key_password "#{File.exists?('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase') ? open('/var/run/ocp-collector/secrets/kafka-fluentd/passphrase','r') do |f|f.read end : ''}"scram_mechanism "PLAIN"
                <format>
                  @type json
                </format>
                <buffer clo-topic>
                  @type file
                  path '/var/lib/fluentd/kafka_app'
                  flush_mode interval
                  flush_interval 1s
                  flush_thread_count 2
                  retry_type exponential_backoff
                  retry_wait 1s
                  retry_max_interval 60s
                  retry_timeout 60m
                  queued_chunks_limit_size "#{ENV['BUFFER_QUEUE_LIMIT'] || '32'}"
                  total_limit_size "#{ENV['TOTAL_LIMIT_SIZE_PER_BUFFER'] || '8589934592'}"
                  chunk_limit_size "#{ENV['BUFFER_SIZE_LIMIT'] || '8m'}"
                  overflow_action block
                  disable_chunk_backup true
                </buffer>
              </match>
            </label>

             


            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.8-rhel-9_upstream_d420e22935d4ba836268330792f59bc4: Updated 3 upstream sources

            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.7-rhel-9_upstream_d420e22935d4ba836268330792f59bc4: Updated 4 upstream sources

            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.8-rhel-9_upstream_a7cd2b215b81219a515da362f7f2b037: Updated 2 upstream sources

            GitLab CEE Bot added a comment - CPaaS Service Account mentioned this issue in a merge request of openshift-logging / Log Collection Midstream on branch openshift-logging-5.7-rhel-9_upstream_a7cd2b215b81219a515da362f7f2b037: Updated 3 upstream sources

            Jeffrey Cantrill added a comment -

            We do not pass a file for the password either. There is a simple ruby script that loads it on the fly, like https://github.com/openshift/cluster-logging-operator/blob/master/internal/generator/fluentd/output/elasticsearch/userpass.go#L16

            Jeffrey Cantrill added a comment -

            vparfono the proposed solution looks to be correct. The passphrase is added to the secret and we load it from a file so as not to expose the secret in the configmap. This is the same logic used for retrieving the username and password, which are also strings normally added directly to the configuration.

            Vitalii Parfonov added a comment -

            jcantril@redhat.com Can you confirm whether the configuration proposed in this issue is correct? I'm asking because, according to the code (fluent/plugin/out_kafka2.rb#L150), it SHOULD be NOT a file but a simple string containing the passphrase.
            Thanks
