OpenShift Logging / LOG-2167

[Vector] Collector pods fail to start with a configuration error when using Kafka SASL over SSL


Details

    • Logging (Core) - Sprint 219, Logging (Core) - Sprint 220

    Description

      Logging 5.5 tech-preview/Core

      Steps to reproduce the issue:

      1. Deploy Kafka and enable SASL over SSL.

      2. Create a ClusterLogForwarder instance to forward logs to Kafka.

      cat 20_create-clf-kafka-sasl_ssl.sh
      kafka_namespace=${KAFKA_NAMESPACE:-openshift-logging}
      kafka_user_name="admin"
      kafka_user_password="admin-secret"

      oc delete clf instance -n openshift-logging
      oc delete secret kafka-fluentd -n openshift-logging
      oc create secret generic kafka-fluentd \
        --from-file=ca-bundle.crt=ca/ca_bundle.crt \
        --from-literal=username=${kafka_user_name} \
        --from-literal=password=${kafka_user_password} \
        --from-literal=sasl_over_ssl=true \
        -n openshift-logging

      cat <<EOF | oc create -f -
      apiVersion: logging.openshift.io/v1
      kind: ClusterLogForwarder
      metadata:
        name: instance
        namespace: openshift-logging
      spec:
        outputs:
          - name: kafka-app
            url: tls://kafka.${kafka_namespace}.svc.cluster.local:9093/clo-topic
            type: kafka
            secret:
              name: kafka-fluentd
        pipelines:
          - name: test-app
            inputRefs:
            - application
            outputRefs:
            - kafka-app
      EOF
       

      3. Deploy a ClusterLogging instance with Vector as the collector.

      4. Check the collector pod status and logs.

      collector-jnrcs                                 1/2     CrashLoopBackOff   5 (32s ago)     3m37s
       
      oc logs collector-jnrcs -c collector
      Jan 25 08:35:24.227  INFO vector::app: Log level is enabled. level="debug"
      Jan 25 08:35:24.227  INFO vector::sources::host_metrics: PROCFS_ROOT is unset. Using default '/proc' for procfs root.
      Jan 25 08:35:24.227  INFO vector::sources::host_metrics: SYSFS_ROOT is unset. Using default '/sys' for sysfs root.
      Jan 25 08:35:24.228  INFO vector::app: Loading configs. path=[("/etc/vector/vector.toml", Some(Toml))]
      Jan 25 08:35:24.232  INFO vector::sources::kubernetes_logs: Obtained Kubernetes Node name to collect logs for (self). self_node_name="ikansek-qdfkx-master-0.c.openshift-qe.internal"
      Jan 25 08:35:24.246 ERROR vector::topology: Configuration error. error=Sink "kafka_app": creating kafka producer failed: Client creation error: Invalid sasl.kerberos.kinit.cmd value: Property not available: "sasl.kerberos.keytab" 
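
      The "sasl.kerberos" properties in the error are a clue: when no SASL
      mechanism is configured, librdkafka defaults sasl.mechanisms to GSSAPI,
      and its default sasl.kerberos.kinit.cmd template expands
      %{sasl.kerberos.keytab}, which is unset, so producer creation fails
      before the supplied username/password are ever used. A sketch of the
      effective client properties (an interpretation, not taken from the
      report):

      # What the producer effectively sees when no SASL mechanism is set
      security.protocol=SASL_SSL
      sasl.mechanisms=GSSAPI   # librdkafka default when left unset
      # the default kinit command template references an unset property:
      # kinit ... -t "%{sasl.kerberos.keytab}" ... -> "Property not available"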

      The Vector config generated for Kafka SASL over SSL:

      # Rename log stream to "application"
      [transforms.application]
      type = "remap"
      inputs = ["route_container_logs.app"]
      source = """
      .log_type = "application"
      """
      
      
      
      [transforms.test-app]
      type = "remap"
      inputs = ["application"]
      source = """
      .
      """
       
      # Kafka config
      [sinks.kafka_app]
      type = "kafka"
      inputs = ["test-app"]
      bootstrap_servers = "kafka.openshift-logging.svc.cluster.local:9093"
      topic = "clo-topic"
       
      [sinks.kafka_app.encoding]
      codec = "json"
      timestamp_format = "rfc3339"
       
      # TLS Config
      [sinks.kafka_app.tls]
      ca_file = "/var/run/ocp-collector/secrets/kafka-fluentd/ca-bundle.crt"
      enabled = true
       
      # Sasl Config
      [sinks.kafka_app.sasl]
      user = "admin"
      password = "admin-secret"
      enabled = true 
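
      A plausible fix (an assumption, not confirmed in this issue) is for the
      generated config to set an explicit SASL mechanism so librdkafka does
      not fall back to its GSSAPI default. Field names vary across Vector
      versions; reusing the keys from the generated config above:

      # Sasl Config (sketch; "PLAIN" is a placeholder -- use the mechanism
      # the Kafka listener is actually configured with, e.g. SCRAM-SHA-512)
      [sinks.kafka_app.sasl]
      user = "admin"
      password = "admin-secret"
      mechanism = "PLAIN"
      enabled = true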


          People

            vimalkum@redhat.com Vimal Kumar
            rhn-support-ikanse Ishwar Kanse
            Ishwar Kanse Ishwar Kanse
