OpenShift Logging / LOG-1916

[release-5.2] Fluentd logs emit transaction failed: error_class=NoMethodError while forwarding to external syslog server


Details

    • Avoid the error when the object has no strip() method by checking for the method before calling it. This situation can occur if we receive a number (or another non-string value) instead of a string.
    • Logging (Core) - Sprint 209

    Description

      Description of problem:
      When forwarding logs to an external syslog server using ClusterLogForwarder, the "emit transaction failed" error appears in the fluentd pods.

      Version-Release number of selected component (if applicable):
      Cluster version is 4.6
      cluster-logging - 4.6

      Error messages seen in the pod:

      ~~~

      2021-06-25T09:11:53.889199910+09:00 2021-06-25 09:11:53 +0900 [warn]: emit transaction failed: error_class=NoMethodError error="undefined method `strip' for #<Hash:0x00007f6080042d00>" location="/etc/fluent/plugin/filter_parse_json_field.rb:83:in `block in do_replace_json_log'" tag="kubernetes.var.log.containers.analytics-client-5b6df86bc6-rd4fr_apic_client-922f752cd575c1b518866672111523acdf61f55447238e0aaf74e880a365e611.log"
      2021-06-25T09:11:53.889232126+09:00 2021-06-25 09:11:53 +0900 [warn]: suppressed same stacktrace

      ~~~
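The NoMethodError above can be reproduced in plain Ruby: Hash, unlike String, has no strip method, so calling it on a structured (already parsed) log value raises exactly this error. A minimal sketch, not taken from the cluster:

```ruby
# A structured log record whose field value is a Hash, not a String --
# the same shape as the record in the stack trace above.
record = { 'log' => { 'message' => 'hello' } }

begin
  # Mirrors `(record['log'] || "").strip` from filter_parse_json_field.rb.
  # Hash does not define strip, so this raises NoMethodError.
  (record['log'] || '').strip
rescue NoMethodError => e
  puts e.class # NoMethodError
end
```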

       

      We checked the error and located the file /etc/fluent/plugin/filter_parse_json_field.rb in the fluentd pod.

       

      From reading the file, the error points to the function "do_replace_json_log". Could you please take a look at the code and help us understand why the customer is getting this error?

       

      FYI: the code below is from the internal cluster.

      ~~~
      module Fluent::Plugin
        class ParseJSONFieldFilter < Fluent::Plugin::Filter
          Fluent::Plugin.register_filter('parse_json_field', self)

          config_param :merge_json_log, :bool, default: true
          config_param :preserve_json_log, :bool, default: true
          config_param :replace_json_log, :bool, default: false
          config_param :json_fields, :array, default: ['MESSAGE', 'log']

          def initialize
            super
          end

          def configure(conf)
            super
          end

          def filter_stream(tag, es)
            return es unless @merge_json_log || @replace_json_log
            new_es = Fluent::MultiEventStream.new
            if @merge_json_log
              es.each { |time, record|
                record = do_merge_json_log(record)
                new_es.add(time, record)
              }
            elsif @replace_json_log
              es.each { |time, record|
                record = do_replace_json_log(record)
                new_es.add(time, record)
              }
            end
            new_es
          end

          def do_merge_json_log(record)
            json_fields.each do |merge_json_log_key|
              if record.has_key?(merge_json_log_key)
                value = (record[merge_json_log_key] || "").strip
                if value.start_with?('{') && value.end_with?('}')
                  begin
                    record = JSON.parse(value).merge(record)
                    unless @preserve_json_log
                      record.delete(merge_json_log_key)
                    end
                  rescue JSON::ParserError
                    log.debug "parse_json_field could not parse field #{merge_json_log_key} as JSON: value #{value}"
                  end
                end
                break
              end
            end
            record
          end

          def do_replace_json_log(record)
            json_fields.each do |merge_json_log_key|
              if record.has_key?(merge_json_log_key)
                # The stack trace points here: when the field value is a Hash,
                # calling strip on it raises NoMethodError.
                value = (record[merge_json_log_key] || "").strip
                if value.start_with?('{') && value.end_with?('}')
                  begin
                    parsed_value = JSON.parse(value)
                    record[merge_json_log_key] = parsed_value
                  rescue JSON::ParserError
                    log.debug "parse_json_field could not parse field #{merge_json_log_key} as JSON: value #{value}"
                  end
                end
                break
              end
            end
            record
          end
        end
      end
      ~~~
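As a sketch of the fix described in the release note above (check for the method before calling it), the strip call could be guarded with respond_to? so that Hash or numeric values from structured logs pass through unchanged. This is an assumption about the shape of the eventual patch, not the shipped code; safe_json_value is a hypothetical helper name:

```ruby
# Hypothetical guarded version of the value extraction used in
# do_merge_json_log / do_replace_json_log.
def safe_json_value(record, key)
  value = record[key] || ''
  # Hash, Integer, etc. do not respond to strip; return them untouched
  # instead of raising NoMethodError.
  return value unless value.respond_to?(:strip)
  value.strip
end

safe_json_value({ 'log' => { 'message' => 'hello' } }, 'log') # Hash returned unchanged
safe_json_value({ 'log' => '  {"a":1} ' }, 'log')             # => '{"a":1}'
```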

      Cluster log forwarding CRD

      ~~~

      spec:
        outputs:
        - name: rsyslog-X
          syslog:
            appName: ocp
            facility: local0
            procID: fluentd
            rfc: RFC5424
            severity: informational
          type: syslog
          url: tcp://IP:PORT
        - name: rsyslog-X
          syslog:
            appName: ocp
            facility: local0
            procID: fluentd
            rfc: RFC5424
            severity: informational
          type: syslog
          url: tcp://IP:PORT
        - name: remote-fluentd-X
          type: fluentdForward
          url: tcp://IP:PORT
        - name: remote-fluentd-X
          type: fluentdForward
          url: tcp://IP:PORT
        pipelines:
        - inputRefs:
          - application
          labels:
            syslog: ocp
          name: syslog-X
          outputRefs:
          - rsyslog-X
          - rsyslog-X
        - inputRefs:
          - audit
          name: td-agent-audit
          outputRefs:
          - remote-fluentd-X
          - remote-fluentd-X

      ~~~

      Please let us know if any additional information is required.

       

            People

              vparfono Vitalii Parfonov
              rhn-support-aharchin Akhil Harchinder (Inactive)
              Ishwar Kanse Ishwar Kanse
