  OpenShift Logging / LOG-4200

[vector] Collector pods CrashLoopBackOff if a Kafka output is defined but not used in any pipeline



      Description of problem:

      $ oc get pods
      NAME                                        READY   STATUS             RESTARTS      AGE
      cluster-logging-operator-758d6c95f6-sljc7   1/1     Running            0             4h39m
      collector-24lpm                             1/2     Error              2 (24s ago)   29s
      collector-676wd                             1/2     CrashLoopBackOff   2 (17s ago)   34s

      $ oc logs collector-24lpm
      Defaulted container "collector" out of: collector, logfilesmetricexporter
      2023-06-05T08:20:52.845682Z ERROR vector::cli: Configuration error. error=Transform "kafka_app_dedot" has no inputs

      How reproducible:

      Always

      Steps to Reproduce:

      1. Define a Kafka output, but do not reference it in any pipeline (see the sketch after the CR below):
      apiVersion: logging.openshift.io/v1
      kind: ClusterLogForwarder
      metadata:
        name: instance
        namespace: openshift-logging
      spec:
        outputs:
        - name: kafka-app
          secret:
            name: kafka-fluentd
          type: kafka
          url: tls://kafka.openshift-logging.svc.cluster.local:9093/clo-topic
        pipelines:
        - inputRefs:
          - audit
          - infrastructure
          - application
          name: test-app
          outputRefs:
          - default
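
      To illustrate why Vector exits: the generated vector.toml still contains the dedot transform for the unused kafka-app output, but no pipeline feeds it, and Vector refuses to start when a transform has an empty inputs list. The snippet below is a minimal, hypothetical standalone config that reproduces the same class of error; only the transform name is taken from the error message, the other component names and types are illustrative and are not what the operator actually generates.

      # Hypothetical minimal vector.toml reproducing the startup failure.
      [sources.demo_logs]
      type = "demo_logs"
      format = "json"

      # Dedot-style transform left dangling: no pipeline routes logs into it,
      # so its inputs list is empty and startup validation fails with
      # 'Transform "kafka_app_dedot" has no inputs'.
      [transforms.kafka_app_dedot]
      type = "remap"
      inputs = []
      source = '.forwarded = true'

      [sinks.stdout]
      type = "console"
      inputs = ["kafka_app_dedot"]
      encoding.codec = "json"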
      

      Actual results:

      $ oc logs collector-24lpm
      Defaulted container "collector" out of: collector, logfilesmetricexporter
      2023-06-05T08:20:52.845682Z ERROR vector::cli: Configuration error. error=Transform "kafka_app_dedot" has no inputs

      Expected results:

      Unused outputs or inputs are ignored when generating vector.toml.

      Additional info:
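
      A likely workaround until config generation skips unreferenced outputs (not verified here) is to either delete the unused kafka-app output from the ClusterLogForwarder, or reference it from a pipeline so the dedot transform gets an input, e.g.:

      pipelines:
      - inputRefs:
        - application
        name: test-app
        outputRefs:
        - default
        - kafka-app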

              Assignee: Unassigned
              Reporter: Anping Li (rhn-support-anli)