- Bug
- Resolution: Obsolete
- Minor
- None
- None
https://bugzilla.redhat.com/show_bug.cgi?id=1904372
[Document URL]
https://docs.openshift.com/container-platform/4.6/logging/cluster-logging-external.html#cluster-logging-collector-log-forward-kafka_cluster-logging-external
[Section Number and Name]
Forwarding logs to a Kafka broker
[Describe the issue]
The example for forwarding logs to Kafka reads:
~~~
...
  - name: infra-logs
    type: kafka
    url: tls://kafka.devlab2.example.com:9093/infra-topic
  - name: audit-logs
...
~~~
The infra-logs output is defined to send over TLS, but no secret is defined for it.
[Suggestions for improvement]
Include the secret needed to send via TLS in the example, or use TCP for an insecure connection. If the example keeps TLS, it could look like this:
~~~
...
  - name: infra-logs
    type: kafka
    url: tls://kafka.devlab2.example.com:9093/infra-topic
    secret:
      name: kafka-secret-devlab2
  - name: audit-logs
...
~~~
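For completeness, the referenced secret would have to exist in the openshift-logging namespace. A minimal sketch of what it might contain is below; the secret name kafka-secret-devlab2 simply matches the example above, the key names tls.crt, tls.key, and ca-bundle.crt are the ones the collector expects for TLS, and the base64 values are placeholders:
~~~
apiVersion: v1
kind: Secret
metadata:
  name: kafka-secret-devlab2     # referenced by spec.outputs[].secret.name
  namespace: openshift-logging   # the forwarder reads secrets from this namespace
type: Opaque
data:
  tls.crt: <base64-encoded client certificate>
  tls.key: <base64-encoded client key>
  ca-bundle.crt: <base64-encoded CA bundle>
~~~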
Or, if the intent is to show an example of Kafka over plain TCP:
~~~
...
  - name: infra-logs
    type: kafka
    url: tcp://kafka.devlab2.example.com:9093/infra-topic
  - name: audit-logs
...
~~~
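For context, here is a minimal sketch of how the corrected output could fit into a complete ClusterLogForwarder resource. The hostname and secret name reuse the values from the examples above, and the pipeline wiring is assumed for illustration only:
~~~
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
spec:
  outputs:
    - name: infra-logs
      type: kafka
      url: tls://kafka.devlab2.example.com:9093/infra-topic
      secret:
        name: kafka-secret-devlab2   # required when the URL uses the tls:// scheme
  pipelines:
    - name: infra-topic
      inputRefs:
        - infrastructure             # forward infrastructure logs
      outputRefs:
        - infra-logs
~~~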
Additional information:
- relates to RHDEVDOCS-2755 "[5.7 prereq] Create Logging API reference docs" (Closed)