DBZ-9652: Debezium Server configuration import creates invalid pipelines

      Given the following Debezium Server configuration, uploaded through the Debezium Server configuration import:

      debezium.source.connector.class=io.debezium.connector.postgresql.PostgresConnector
      debezium.source.bootstrap.servers=my-cluster-kafka-bootstrap.kafka.svc.cluster.local:9092
      debezium.source.offset.flush.interval.ms=0
      debezium.source.topic.prefix=server1
      debezium.source.slot.name=dbz_test
      debezium.source.publication.name=dbz_publication_test
      debezium.source.database.dbname=postgres
      debezium.source.database.hostname=postgres
      debezium.source.database.port=5432
      debezium.source.database.user=user
      debezium.source.database.password=password
      
      debezium.sink.type=kafka
      debezium.sink.kafka.producer.bootstrap.servers=my-cluster-kafka-bootstrap.kafka.svc.cluster.local:9092
      debezium.sink.kafka.producer.key.serializer=org.apache.kafka.common.serialization.StringSerializer
      debezium.sink.kafka.producer.value.serializer=org.apache.kafka.common.serialization.StringSerializer
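
      For reference, the Kafka sink in Debezium Server treats debezium.sink.kafka.producer.* as pass-through configuration: the prefix is stripped and the remainder is handed to the KafkaProducer, so the producer ultimately expects plain keys such as bootstrap.servers and key.serializer. A minimal sketch of that prefix handling (an illustration only, not the actual KafkaChangeConsumer code):

      import java.util.Map;
      import java.util.Properties;

      // Sketch only: mimics how the sink-side pass-through properties end up as plain
      // producer keys once the debezium.sink.kafka.producer. prefix has been removed.
      public class ProducerConfigSketch {

          static final String PREFIX = "debezium.sink.kafka.producer.";

          static Properties producerProps(Map<String, String> debeziumConfig) {
              Properties props = new Properties();
              debeziumConfig.forEach((key, value) -> {
                  if (key.startsWith(PREFIX)) {
                      props.put(key.substring(PREFIX.length()), value);
                  }
              });
              return props;
          }

          public static void main(String[] args) {
              Properties props = producerProps(Map.of(
                      "debezium.sink.kafka.producer.bootstrap.servers", "my-cluster-kafka-bootstrap.kafka.svc.cluster.local:9092",
                      "debezium.sink.kafka.producer.key.serializer", "org.apache.kafka.common.serialization.StringSerializer",
                      "debezium.sink.kafka.producer.value.serializer", "org.apache.kafka.common.serialization.StringSerializer"));
              // Prints key.serializer=..., value.serializer=..., bootstrap.servers=...
              props.forEach((k, v) -> System.out.println(k + "=" + v));
          }
      }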
      

      When deployed, the pipeline fails with the following error:

      2025-11-05T10:25:00.323Z | Caused by: org.apache.kafka.common.config.ConfigException: Invalid value null for configuration key.serializer: must be non-null.
      2025-11-05T10:25:00.323Z |  at org.apache.kafka.clients.producer.ProducerConfig.appendSerializerToConfig(ProducerConfig.java:655)
      2025-11-05T10:25:00.323Z |  at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:301)
      2025-11-05T10:25:00.323Z |  at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:284)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.kafka.KafkaChangeConsumer.start(KafkaChangeConsumer.java:73)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.kafka.KafkaChangeConsumer_Bean.doCreate(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.kafka.KafkaChangeConsumer_Bean.create(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.kafka.KafkaChangeConsumer_Bean.create(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.ChangeConsumerFactory.create(ChangeConsumerFactory.java:62)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.ChangeConsumerFactory_ClientProxy.create(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.DebeziumServer.start(DebeziumServer.java:123)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.DebeziumServer_Bean.doCreate(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.DebeziumServer_Bean.create(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.debezium.server.DebeziumServer_Bean.create(Unknown Source)
      2025-11-05T10:25:00.323Z |  at io.quarkus.arc.impl.AbstractSharedContext.createInstanceHandle(AbstractSharedContext.java:119)
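
      The ConfigException above is what kafka-clients throws whenever a producer is constructed without key.serializer in its configuration, which is consistent with the prefixed keys never being mapped to the plain producer properties. A minimal repro sketch, assuming only that kafka-clients is on the classpath:

      import java.util.Properties;
      import org.apache.kafka.clients.producer.KafkaProducer;
      import org.apache.kafka.clients.producer.ProducerConfig;

      // Sketch only: constructing a producer whose config lacks key.serializer /
      // value.serializer fails immediately, before any broker connection is attempted.
      public class MissingSerializerRepro {
          public static void main(String[] args) {
              Properties props = new Properties();
              props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
                      "my-cluster-kafka-bootstrap.kafka.svc.cluster.local:9092");
              // key.serializer / value.serializer intentionally omitted
              new KafkaProducer<String, String>(props);
              // -> org.apache.kafka.common.config.ConfigException:
              //    Invalid value null for configuration key.serializer: must be non-null.
          }
      }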
      

      For the source, the debezium.source. prefix is not removed, so the imported source configuration keeps the fully prefixed keys (see the first attached screenshot), and the destination likewise retains the debezium.sink. prefix (second screenshot).

      As I understand it, the prefixes only indicate which properties belong to the source and which to the destination, so the import process should account for this and strip them when creating the source and target configurations.
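
      A hypothetical sketch of the kind of prefix handling the import could apply when splitting the flat Debezium Server properties into source and destination configurations (class and method names are made up for illustration, and the exact shape the destination config should take is an open question for the fix):

      import java.util.HashMap;
      import java.util.Map;

      // Sketch only: strip debezium.source. for the source and debezium.sink.<type>.
      // for the destination, so the created pipeline resources carry the plain keys
      // that the connector and the sink adapter actually expect.
      public class ConfigImportSketch {

          static final String SOURCE_PREFIX = "debezium.source.";
          static final String SINK_PREFIX = "debezium.sink.";

          static Map<String, String> stripPrefix(Map<String, String> flat, String prefix) {
              Map<String, String> result = new HashMap<>();
              flat.forEach((key, value) -> {
                  if (key.startsWith(prefix)) {
                      result.put(key.substring(prefix.length()), value);
                  }
              });
              return result;
          }

          public static void main(String[] args) {
              Map<String, String> flat = Map.of(
                      "debezium.source.connector.class", "io.debezium.connector.postgresql.PostgresConnector",
                      "debezium.source.topic.prefix", "server1",
                      "debezium.sink.type", "kafka",
                      "debezium.sink.kafka.producer.key.serializer", "org.apache.kafka.common.serialization.StringSerializer");

              String sinkType = flat.get("debezium.sink.type");
              Map<String, String> source = stripPrefix(flat, SOURCE_PREFIX);
              Map<String, String> destination = stripPrefix(flat, SINK_PREFIX + sinkType + ".");

              System.out.println(source);      // keys: connector.class, topic.prefix
              System.out.println(destination); // keys: producer.key.serializer
          }
      }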

        Attachments:
          1. image-2025-11-05-05-26-32-157.png (55 kB, Chris Cranford)
          2. image-2025-11-05-05-27-14-754.png (28 kB, Chris Cranford)

              People: Indra Shukla (ishukla), Chris Cranford (ccranfor@redhat.com)