OpenShift Logging / LOG-3935

[Vector] [LokiStack] Collector should comply with the tlsSecurityProfile config set globally when using LokiStack as the default log store.


    • Epic: OBSDA-160 - Comply with OCP cluster-wide cryptographic policies
    • Sprint: Log Collection - Sprint 234, Log Collection - Sprint 235

      Description of problem:

      When using LokiStack as the default log store, the Vector collector should use the tlsSecurityProfile config set globally in the apiserver/cluster configuration for the Loki sink. This should not require enabling a feature gate: the profile is already applied on the log store (server) side without one, so the collector (client) side should comply the same way.
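
      For illustration, with the default Intermediate profile the Loki sink's TLS block would then be expected to carry the profile's minimum TLS version and ciphers, reusing the same keys Vector already emits for the Prometheus exporter sink (a sketch of the expected output, not the current one):

      [sinks.default_loki_infra.tls]
      enabled = true
      ca_file = "/var/run/secrets/kubernetes.io/serviceaccount/service-ca.crt"
      min_tls_version = "VersionTLS12"
      ciphersuites = "TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,DHE-RSA-AES128-GCM-SHA256,DHE-RSA-AES256-GCM-SHA384"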

      Version-Release number of selected component (if applicable):

      cluster-logging.v5.7.0

      loki-operator.v5.7.0

      How reproducible:

      Always

      Steps to Reproduce:

      *Set any TLS profile in the global apiserver/cluster configuration, as shown below. By default, if no profile is set, Cluster Logging must use the Intermediate profile.
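
      A sketch setting the Old profile on the cluster APIServer (any supported profile works):

      apiVersion: config.openshift.io/v1
      kind: APIServer
      metadata:
        name: cluster
      spec:
        tlsSecurityProfile:
          type: Old
          old: {}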

      *Create a LokiStack instance, for example as below.
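
      A minimal LokiStack sketch matching the name referenced in the next step; the storage secret and storage class names are hypothetical placeholders:

      apiVersion: loki.grafana.com/v1
      kind: LokiStack
      metadata:
        name: lokistack-instance
        namespace: openshift-logging
      spec:
        size: 1x.extra-small
        storage:
          secret:
            name: lokistack-storage # hypothetical object storage secret
            type: s3
        storageClassName: gp3-csi # hypothetical storage class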

      *Create a ClusterLogging instance with Vector as the collector and the LokiStack instance as the default log store:

      apiVersion: "logging.openshift.io/v1"
      kind: "ClusterLogging"
      metadata:
        name: "instance"
        namespace: openshift-logging
      spec:
        managementState: "Managed"
        logStore:
          type: "lokistack"
          lokistack:
            name: lokistack-instance
        collection:
          type: "vector" 

      *Extract and check the Vector config: the ciphers and minimum TLS version are set according to the TLS profile for the prometheus sink only, not for the Loki sinks.
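
      Assuming the rendered config is stored in the collector-config secret (the secret name may differ by release):

      oc -n openshift-logging extract secret/collector-config --keys=vector.toml --to=-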

      [sinks.default_loki_infra]
      type = "loki"
      inputs = ["default_loki_infra_dedot"]
      endpoint = "https://lokistack-instance-gateway-http.openshift-logging.svc:8080/api/logs/v1/infrastructure"
      out_of_order_action = "accept"
      healthcheck.enabled = false
      [sinks.default_loki_infra.encoding]
      codec = "json"
      [sinks.default_loki_infra.labels]
      kubernetes_container_name = "{{kubernetes.container_name}}"
      kubernetes_host = "${VECTOR_SELF_NODE_NAME}"
      kubernetes_namespace_name = "{{kubernetes.namespace_name}}"
      kubernetes_pod_name = "{{kubernetes.pod_name}}"
      log_type = "{{log_type}}"
      [sinks.default_loki_infra.tls]
      enabled = true
      ca_file = "/var/run/secrets/kubernetes.io/serviceaccount/service-ca.crt"
      # Bearer Auth Config
      [sinks.default_loki_infra.auth]
      strategy = "bearer"
      token = "eyJhbGciOiJSUzI1NiIsImtpZCI6ImRoVUplXzg2ZHktc1dYUzQ4Qk1neGtrZEZBN294T0tTeUp2Nl9tbG55blkifQ.eyJpc3MiOiJrdWJlcm5ldGVzL3NlcnZpY2VhY2NvdW50Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9uYW1lc3BhY2UiOiJvcGVuc2hpZnQtbG9nZ2luZyIsImt1YmVybmV0ZXMuaW8vc2VydmljZWFjY291bnQvc2VjcmV0Lm5hbWUiOiJsb2djb2xsZWN0b3ItdG9rZW4iLCJrdWJlcm5ldGVzLmlvL3NlcnZpY2VhY2NvdW50L3NlcnZpY2UtYWNjb3VudC5uYW1lIjoibG9nY29sbGVjdG9yIiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9zZXJ2aWNlLWFjY291bnQudWlkIjoiZTY3OWI4NzMtZDhlNC00ODMzLThhOTUtOTE5OTJiOGJkNGJhIiwic3ViIjoic3lzdGVtOnNlcnZpY2VhY2NvdW50Om9wZW5zaGlmdC1sb2dnaW5nOmxvZ2NvbGxlY3RvciJ9.M5W7upDUpvyWi6oKhdxvXOPnzXdRSirv32hiQxg4RNuPdhcSpTon4E-TZGEq4LEG_B1NtmTUAFOpbX9q8YlmEToUZNAX-PUH_UcfYGDdfWi66ynsRSfCPNXWDr2ojWBPmUzlzXhrlqtWQm7wJzTu2M-UnPzdjnscz9ljoR8IdoEzTi561G8oRbxBnYB5m92aw3QO4DiqGqqAQDTA0t8CSBWyxDdGPEvmkg2zAdC4w93soOEkbNXq7LRAtU84pdWluhPR6vt4TkPPuyRo5JLdyDtiWWCdIZsbg4Huysodlyaj7pMOqYG6XabQ6zOaB0YEkOrdjo7Yqki8seZUiWz3ZeamtofjsZvioE-hEZJlX7J9_ItaF9VaZ1ONEftgZhxmEulAatq9bDSq-0zSnM8uw7ze6LDD-7QQgr8TeAF0x_6k6ipcqIHg5_CakvaXFo5mUADqUr5iBML3sm_uBTg6NrrYt2OFx3o1lRGudS0AS2jlXUUjSdTrGrbVFqZZF49Skc4UMde-oooM_nRfagZt4msYL_10Nex01Buu228i2LkiroJ-F1RroyNo0rxRvIvFJQJNvCFAcQ3Xe4u20bFSQL50x3LxYrvHofv0Q1tr87Jh8h6vY13806K3GxeYg6a_eWghoY6lFlv0l4PZZm-PaoatkY3OK-Uh23ynOLl4jbc"
      [transforms.add_nodename_to_metric]
      type = "remap"
      inputs = ["internal_metrics"]
      source = '''
      .tags.hostname = get_env_var!("VECTOR_SELF_NODE_NAME")
      '''
      [sinks.prometheus_output]
      type = "prometheus_exporter"
      inputs = ["add_nodename_to_metric"]
      address = "[::]:24231"
      default_namespace = "collector"
      [sinks.prometheus_output.tls]
      enabled = true
      key_file = "/etc/collector/metrics/tls.key"
      crt_file = "/etc/collector/metrics/tls.crt"
      min_tls_version = "VersionTLS12"
      ciphersuites = "TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,DHE-RSA-AES128-GCM-SHA256,DHE-RSA-AES256-GCM-SHA384
      

      Additional info:

      When using LokiStack as the log storage, we already comply with the global TLS profile on the server side, which doesn't require any feature gate: https://issues.redhat.com/browse/LOG-895

       

      Assignee: Jeffrey Cantrill (jcantril@redhat.com)
      Reporter: Ishwar Kanse (rhn-support-ikanse)