- Bug
- Resolution: Done
- Normal
- Logging 5.9.0
- False
- False
- NEW
- NEW
- Release Note Not Required
- Log Collection - Sprint 250
- Moderate
Description of problem:
Compression is not enabled for the kafka output.
According to https://vector.dev/docs/reference/configuration/sinks/kafka/#compression, the Vector kafka sink supports the following compression methods:
gzip, lz4, none, snappy, zstd
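As a sketch of the requested fix ("add lz4 to the tuning.compression value list"), a small validator against the codec set documented by Vector above; the function and set names are illustrative, not operator code:

```python
# Illustrative only: the codec set Vector's kafka sink documents
# (gzip, lz4, none, snappy, zstd); names are hypothetical helpers.
VALID_KAFKA_COMPRESSION = {"gzip", "lz4", "none", "snappy", "zstd"}

def validate_compression(codec: str) -> str:
    """Return the codec if Vector's kafka sink documents it, else raise."""
    if codec not in VALID_KAFKA_COMPRESSION:
        raise ValueError(f"unsupported kafka compression: {codec!r}")
    return codec
```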
How reproducible:
Always
Steps to Reproduce:
# Deploy CLF with tuning options
cat <<EOF | oc apply -f -
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: clf-logs
spec:
  inputs:
  - application:
      namespaces:
      - project-qa-1
    name: myLogsQA
  outputs:
  - name: kafka-app
    url: tcp://my-cluster-kafka-bootstrap.amq-aosqe.svc:9092/topic-logging-app
    type: kafka
    secret:
      name: to-kafka-secret
    tuning:
      compression: 'gzip'
      delivery: AtMostOnce
      maxWrite: 10M
      minRetryDuration: 5
      maxRetryDuration: 10
  pipelines:
  - inputRefs:
    - myLogsQA
    name: pipe1
    outputRefs:
    - kafka-app
  serviceAccountName: clf-to-kafka
EOF
- Check the generated vector.toml for the compression setting.
- ...
Actual results:
Expected results:
compression: xx is added to vector.toml.
Add lz4 to the tuning.compression value list.
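For reference, a minimal sketch of what the expected vector.toml fragment could look like once compression is wired through; the sink id `kafka_app` and the exact set of keys are assumptions, not the actual generated output:

```toml
# Sketch only: sink id and surrounding keys are illustrative.
[sinks.kafka_app]
type = "kafka"
bootstrap_servers = "my-cluster-kafka-bootstrap.amq-aosqe.svc:9092"
topic = "topic-logging-app"
compression = "gzip"  # expected to be emitted from spec.outputs[].tuning.compression
```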
Additional info:
- is cloned by LOG-5178: compression should not be supported by googleCloudLogging (Closed)