Type: Bug
Resolution: Unresolved
Priority: Major
In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.
Bug report
For bug reports, provide this information, please:
What Debezium connector do you use and what version?
io.debezium.connector.postgresql.PostgresConnector, 3.3.0.Final
What is the connector configuration?
class: io.debezium.connector.postgresql.PostgresConnector
tasksMax: 1
# https://debezium.io/documentation/reference/stable/connectors/postgresql.html#postgresql-required-configuration-properties
config:
  #################
  # debezium config
  #################
  database.hostname: "mypostgre" # hostname of secondary
  database.port: "32521" # secondary port
  database.user: northwind # username with login and replication rights
  database.password: mypassword # password
  database.dbname: northwind # name of db to sync
  plugin.name: pgoutput
  slot.name: debezium_replication_slot
  slot.failover: true
  slot.drop.on.stop: false
  publication.name: debezium_publication
  publication.autocreate.mode: disabled
  # slot.initial.actions: none
  # prefix for generated kafka topics
  topic.prefix: dev.poc_postgres.northwind
  # list of schemas to capture, comma-separated
  schema.include.list: public
  # list of tables to capture, comma-separated
  #table.include.list: public.<TABLE_1>, public.<TABLE_2>
  #################
  # kafka connect config
  #################
  key.converter: org.apache.kafka.connect.json.JsonConverter
  #key.converter: org.apache.kafka.connect.storage.StringConverter #io.confluent.connect.avro.AvroConverter
  # serializer config - how to serialize data, possible values: json, avro, protobuf, string, bytearray, etc.
  value.converter: io.confluent.connect.avro.AvroConverter
  #value.converter: org.apache.kafka.connect.json.JsonConverter
  value.converter.schema.registry.url: http://apicurio-schema-registry-app-service:8080/apis/ccompat/v7
  schema.change.data.capture.event.logging.enabled: "true"
  schema.history.internal.kafka.bootstrap.servers: "kafka-dev-broker-plain-0:9092,kafka-dev-broker-plain-1:9092,kafka-dev-broker-plain-2:9092"
  schema.history.internal.kafka.topic: "__debezium_postgres_poc_northwind"
  # Show schema information in the message (true for verbose)
  key.converter.schemas.enable: "false"
  value.converter.schemas.enable: "false"
  # Optional: Kafka Connect settings to handle deletes as tombstones
  delete.handling.mode: rewrite
  slot.max.retries: "10" # Optional: more retry attempts
  slot.retry.delay.ms: "10000" # Optional: longer delay between retries (10s)
  # exactly-once support, a Kafka Connect setting rather than a Debezium one
  exactly.once.support: required
  openlineage.integration.enabled: true
  # does not work with composite
  #openlineage.integration.config.file.path: "/mnt/openlineage/marquez-connect-config.yml"
What is the captured database version and mode of deployment?
PostgreSQL 18, deployed on Kubernetes via the CloudNativePG operator
What behavior do you expect?
The composite transport configuration is applied correctly (this may also be a bug in OpenLineage itself), and Debezium does not send its own configuration into Kafka.
What behavior do you see?
Composite transport problem: it does not work when configured via the OpenLineage config file (openlineage.integration.config.file.path); it only works when set through environment variables, for example:
env:
  - name: OPENLINEAGE__TRANSPORT__TYPE
    value: "composite"
  - name: OPENLINEAGE__TRANSPORT__CONTINUE_ON_FAILURE
    value: "true"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__MARQUEZ__TYPE
    value: "http"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__MARQUEZ__URL
    value: "http://marquez:5000"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__MARQUEZ__ENDPOINT
    value: "api/v1/lineage"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__MARQUEZ__COMPRESSION
    value: "gzip"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__KAFKA__TYPE
    value: "kafka"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__KAFKA__TOPIC_NAME
    value: "dev.openlineage.postgres.debezium"
  - name: OPENLINEAGE__TRANSPORT__TRANSPORTS__KAFKA__PROPERTIES
    value: "mykafka:9090"
Do you see the same behaviour using the latest released Debezium version?
Yes, 3.3.0.Final.
Do you have the connector logs, ideally from start till finish?
Yes, but the logs show nothing unusual.
How to reproduce the issue using our tutorial deployment?
Look at the Kafka messages emitted by Debezium's OpenLineage integration: they contain the entire Debezium connector configuration, including the database credentials (see the example consumer command below).
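A quick way to inspect those events, assuming the topic name and broker address used above (a sketch, not part of the original report):
kafka-console-consumer.sh \
  --bootstrap-server mykafka:9090 \
  --topic dev.openlineage.postgres.debezium \
  --from-beginning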
Feature request or enhancement
Do not send the Debezium connector configuration into the OpenLineage Kafka topic.
Which use case/requirement will be addressed by the proposed feature?
Security: the lineage events currently expose sensitive connector settings such as database credentials.
Implementation ideas (optional)
Handle this in the OpenLineage integration configuration, e.g. by allowing the connector configuration to be omitted or redacted from emitted events.