OpenShift Logging / LOG-3824

Collector - When forwarding logs to an external Loki instance without any client TLS certificates, the TLS security profile configuration is not included in the Vector configuration.


Details

    • Log Collection - Sprint 233, Log Collection - Sprint 234

    Description

      Description of problem:

When forwarding logs to an external Loki instance without any client TLS certificates, the tlsSecurityProfile configuration is not added to the Loki sink in the generated Vector config.

      Version-Release number of selected component (if applicable):

      Cluster-logging.v5.7.0

      Loki-operator.v5.7.0

      Server Version: 4.13.0-0.nightly-2023-03-19-052243

      How reproducible:

      Always

      Steps to Reproduce:

*Add a tlsSecurityProfile to the global apiserver/cluster config. If no profile is set, the intermediate profile is used by default.

      oc edit apiserver/cluster
       
      spec:
        audit:
          profile: Default
        tlsSecurityProfile:
          intermediate: {}
          type: Intermediate
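The effective profile can be checked directly against the apiserver resource edited above (a quick sanity check, no assumptions beyond the fields shown):

```shell
# Print the cluster-wide TLS security profile configured on the apiserver
oc get apiserver/cluster -o jsonpath='{.spec.tlsSecurityProfile}{"\n"}'
```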
      

*Create a secret in the openshift-logging namespace for forwarding logs to an externally hosted HTTPS Loki instance.

      oc create secret generic loki-client -n openshift-logging --from-literal=username=<Loki Grafana username> --from-literal=password=<loki Grafana API token>
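To confirm the secret carries only basic-auth credentials and no client TLS material, its keys can be listed (a minimal check; assumes jq is available):

```shell
# List the keys in the secret; only username/password should appear,
# with no tls.crt, tls.key, or ca-bundle.crt entries
oc get secret loki-client -n openshift-logging -o json | jq '.data | keys'
```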

      *Create a CLF to forward to the external Loki instance with the tlsSecurityProfile feature gate enabled.

      apiVersion: logging.openshift.io/v1
      kind: ClusterLogForwarder
      metadata:
        name: instance
        namespace: openshift-logging
        annotations:
          logging.openshift.io/preview-tls-security-profile: enabled
      spec:
        outputs:
        - name: loki-server
          type: loki
          url: https://logs-prod3.grafana.net
          secret:
             name: loki-client
        pipelines:
          - name: to-loki
            inputRefs:
            - application
            outputRefs:
            - loki-server
      

      *Create a ClusterLogging instance.

      apiVersion: "logging.openshift.io/v1"
      kind: "ClusterLogging"
      metadata:
        name: "instance" 
        namespace: "openshift-logging"
      spec:
        managementState: "Managed"  
        collection:
          type: vector
      
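Once the ClusterLogging instance is created, the collector daemonset should roll out before extracting the config; the `component=collector` label is assumed here to be the one applied by the cluster-logging operator:

```shell
# Confirm the Vector collector pods are scheduled and ready on each node
oc get pods -n openshift-logging -l component=collector
```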

*Extract and check the generated Vector config. The loki_server sink below has no [sinks.loki_server.tls] section, so the configured minimum TLS version and cipher suites are not applied.
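One way to extract the config (assuming the operator stores it in the collector-config secret under the vector.toml key, as in the 5.7 release line):

```shell
# Dump the generated Vector configuration to stdout
oc extract secret/collector-config -n openshift-logging --keys=vector.toml --to=-
```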

      [sinks.loki_server]
      type = "loki"
      inputs = ["loki_server_remap"]
      endpoint = "https://logs-prod3.grafana.net"
      out_of_order_action = "accept"
      healthcheck.enabled = false
       
      [sinks.loki_server.encoding]
      codec = "json"
       
      [sinks.loki_server.labels]
      kubernetes_container_name = "{{kubernetes.container_name}}"
      kubernetes_host = "${VECTOR_SELF_NODE_NAME}"
      kubernetes_namespace_name = "{{kubernetes.namespace_name}}"
      kubernetes_pod_name = "{{kubernetes.pod_name}}"
      log_type = "{{log_type}}"
       
      # Basic Auth Config
      [sinks.loki_server.auth]
      strategy = "basic"
      user = "REDACTED"
      password = "REDACTED"
      [transforms.add_nodename_to_metric]
      type = "remap"
      inputs = ["internal_metrics"]
      source = '''
      .tags.hostname = get_env_var!("VECTOR_SELF_NODE_NAME")
      '''
      

      Additional info:

*However, if TLS certs are added to the Loki output secret and a CLF is created to forward logs to an external LokiStack instance, the tlsSecurityProfile ciphers and minimum TLS version are added to the generated Vector config.

      oc -n openshift-logging create secret generic lokistack-gateway-bearer-token --from-literal=token=$TOKEN  --from-literal=ca-bundle.crt="$(oc -n openshift-logging get cm lokistack-instance-ca-bundle -o json | jq -r '.data."service-ca.crt"')"
      apiVersion: logging.openshift.io/v1
      kind: ClusterLogForwarder
      metadata:
        name: instance
        namespace: openshift-logging
        annotations:
          logging.openshift.io/preview-tls-security-profile: enabled
      spec:
        outputs:
         - name: loki-app
           type: loki
           url: https://lokistack-instance-gateway-http.openshift-logging.svc:8080/api/logs/v1/application/
           secret:
             name: lokistack-gateway-bearer-token
         - name: loki-infra
           type: loki
           url: https://lokistack-instance-gateway-http.openshift-logging.svc:8080/api/logs/v1/infrastructure/
           secret:
             name: lokistack-gateway-bearer-token
         - name: loki-audit
           type: loki
           url: https://lokistack-instance-gateway-http.openshift-logging.svc:8080/api/logs/v1/audit/
           secret:
             name: lokistack-gateway-bearer-token
        pipelines:
         - name: send-app-logs
           inputRefs:
           - application
           outputRefs:
           - loki-app
         - name: send-infra-logs
           inputRefs:
           - infrastructure
           outputRefs:
           - loki-infra
         - name: send-audit-logs
           inputRefs:
           - audit
           outputRefs:
           - loki-audit
      

      Example sink config generated for app logs.

      [sinks.loki_app.tls]
      enabled = true
      min_tls_version = "VersionTLS12"
      ciphersuites = "TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,DHE-RSA-AES128-GCM-SHA256,DHE-RSA-AES256-GCM-SHA384"
      ca_file = "/var/run/ocp-collector/secrets/lokistack-gateway-bearer-token/ca-bundle.crt"


          People

            jcantril@redhat.com Jeffrey Cantrill
            rhn-support-ikanse Ishwar Kanse
