Debezium / DBZ-6686

JDBC Sink Connector Fails on Loading Flat Data Containing Struct Type Fields from Kafka


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version/s: 2.4.0.Alpha2
    • Affects Version/s: None
    • Component/s: jdbc-connector
    • Labels: None

      In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.

      Bug report

      For bug reports, please provide the following information:

      What Debezium connector do you use and what version?

      debezium-connector-jdbc-2.3.0.Final

      What is the connector configuration?

      Source:

      {
          "name": "inventory-source-connector",
          "config": {
              "connector.class" : "io.debezium.connector.oracle.OracleConnector",
              "tasks.max" : "1",
              "database.server.name" : "oracle-db-source",
              "database.hostname" : "oracle-db-source",
              "database.port" : "1521",
              "database.user" : "c##logminer",
              "database.password" : "dbz",
              "database.dbname" : "XE",
              "database.out.server.name":"dbzxout",
              "database.oracle.version": "11",
              "schema.history.internal.kafka.bootstrap.servers" : "kafka:9092",
              "schema.history.internal.kafka.topic": "schema-changes.inventory",
              "database.connection.adapter": "logminer",
              "table.include.list" : "INVENTORY.TEST_TAB",
              "database.schema": "inventory",
              "errors.log.enable": "true",
              "snapshot.lock.timeout.ms":"5000",
              "include.schema.changes": "true",
              "snapshot.mode":"always",
              "decimal.handling.mode": "precise",
              "topic.prefix":  "oracle-db-source",
              "schema.history.internal.store.only.captured.databases.ddl": "true",
              "schema.history.internal.store.only.captured.tables.ddl": "true",
              "lob.enabled": "true",
              "transforms": "unwrap",
              "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
              "transforms.unwrap.drop.tombstones": "false",
              "transforms.unwrap.delete.handling.mode": "rewrite"
          }
      }

      Sink:

      {
          "name": "jdbc-sink-postgress-jdbc",
          "config": {
              "connector.class": "io.debezium.connector.jdbc.JdbcSinkConnector",
              "connection.url": "jdbc:postgresql://postgres:5432/inventory",
              "connection.username": "postgres",
              "connection.password": "postgres",
              "tasks.max": "1",
              "table.name.format": "TEST_TAB",
              "topics": "oracle-db-source.INVENTORY.TEST_TAB",
              "insert.mode": "upsert",
              "delete.enabled": "true",
              "primary.key.mode": "record_key",
              "primary.key.fields": "ID",
              "quote.identifiers":  "true",
              "schema.evolution": "basic"
          }
      }

       

      What is the captured database version and mode of deployment?

      (E.g. on-premises, with a specific cloud provider, etc.)

      Source Database: Oracle Database 11g Express Edition (Docker image: oracleinanutshell/oracle-xe-11g:latest)
      Target Database: PostgreSQL 15.3 (Docker image: postgres)
      Deployment: Locally using Docker

      What behaviour do you expect?

      I created table in Oracle:

      create table inventory.test_tab
      (
          id number(7, 0),
          col_float float,
          dt timestamp,
          primary key (id)
      )

      I expect the source connector to load the data from Oracle into Kafka, transforming it into a flat format via the ExtractNewRecordState SMT, and the sink connector to then successfully load the flat data from Kafka into PostgreSQL.
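      For reference, with the unwrap SMT and "delete.handling.mode": "rewrite" from the source configuration above, the value on the topic should be a flat structure with one top-level field per table column plus the "__deleted" rewrite marker, roughly like this (the sample values are illustrative, not copied from the actual topic; with "decimal.handling.mode": "precise" the COL_FLOAT field is emitted as a scale/value struct rather than a plain number):

      {
          "ID": 1,
          "COL_FLOAT": {
              "scale": 2,
              "value": "Ag=="
          },
          "DT": 1688371200000000,
          "__deleted": "false"
      }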

      What behaviour do you see?

      The data is successfully loaded into Kafka, but an error occurs when loading into PostgreSQL:

      org.apache.kafka.connect.errors.ConnectException: Failed to process a sink record
          at io.debezium.connector.jdbc.JdbcChangeEventSink.execute(JdbcChangeEventSink.java:72)
          at io.debezium.connector.jdbc.JdbcSinkConnectorTask.put(JdbcSinkConnectorTask.java:89)
          at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:583)
          at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:336)
          at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:237)
          at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:206)
          at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:202)
          at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:257)
          at org.apache.kafka.connect.runtime.isolation.Plugins.lambda$withClassLoader$1(Plugins.java:177)
          at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:829)
      Caused by: org.apache.kafka.connect.errors.ConnectException: Received an unexpected message type that does not have an 'after' Debezium block
          at io.debezium.connector.jdbc.SinkRecordDescriptor$Builder.readSinkRecordNonKeyData(SinkRecordDescriptor.java:388)
          at io.debezium.connector.jdbc.SinkRecordDescriptor$Builder.build(SinkRecordDescriptor.java:257)
          at io.debezium.connector.jdbc.JdbcChangeEventSink.execute(JdbcChangeEventSink.java:66)
          ... 13 more

      This error occurs whenever the Oracle column type causes the corresponding field in the Kafka record to be serialized as a Struct (this happens, for instance, with the Oracle 'float' type).
      If the 'float' column in the Oracle table is replaced with 'varchar2', the sink connector loads the data into PostgreSQL successfully.
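      For context, with "decimal.handling.mode": "precise" an Oracle FLOAT column is represented in the Connect schema by the Debezium semantic type io.debezium.data.VariableScaleDecimal, i.e. a struct with a scale and an unscaled value, instead of a primitive type. A simplified sketch of the COL_FLOAT field schema (the field layout follows that semantic type; the exact JSON rendering is illustrative):

      {
          "field": "COL_FLOAT",
          "type": "struct",
          "name": "io.debezium.data.VariableScaleDecimal",
          "fields": [
              { "field": "scale", "type": "int32" },
              { "field": "value", "type": "bytes" }
          ]
      }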

      Do you see the same behaviour using the latest released Debezium version?

      (Ideally, also verify with latest Alpha/Beta/CR version)

      I tested version 2.3 (Docker image: debezium/connect:2.3)

      Do you have the connector logs, ideally from start till finish?

      (You might be asked later to provide DEBUG/TRACE level log)

      yes

      How to reproduce the issue using our tutorial deployment?

      <Your answer>

       

      Attachments:

        1. connect.log (43.41 MB, Valeriia Kapishevskaia)

      Assignee: Mario Fiore Vitale (rh-ee-mvitale)
      Reporter: Valeriia Kapishevskaia (v_kapishevskaya) (Inactive)
      Votes: 0
      Watchers: 5
