  Debezium / DBZ-8142

Source data type value exceeds the Debezium data type range and the object cannot be deserialized


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version/s: 3.0.0.CR1
    • Affects Version/s: 2.7.1.Final, 3.0.0.Alpha2
    • Component/s: oracle-connector
    • Labels: None

      In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.

      Bug report

      For bug reports, provide this information, please:

      The Oracle DATE column contains the value 4247-04-05, but the Debezium-side type (java.math.BigInteger) can only hold values up to 2262-05-11 when converting from milliseconds. I think in this case Debezium falls back to the default value for the column, which is null; but the column TODATE is NOT NULL, and I get this exception:
      java.lang.IllegalArgumentException: Unexpected value for JDBC type 93 and column TODATE DATE NOT NULL: class=java.math.BigInteger
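
      A small arithmetic sketch to make the limit concrete, assuming the ~2262 ceiling comes from storing the timestamp as nanoseconds in a signed 64-bit long (which would explain why the value arrives as a BigInteger; the class name below is just for illustration):

      import java.math.BigInteger;
      import java.time.Instant;
      import java.time.LocalDateTime;
      import java.time.ZoneOffset;

      // Illustration only: shows why 4247-04-05 cannot be held as nanoseconds in a long.
      public class DateOverflowDemo {
          public static void main(String[] args) {
              // Largest instant representable as nanoseconds-since-epoch in a signed long (~2262-04-11).
              Instant maxAsLongNanos = Instant.ofEpochSecond(
                      Long.MAX_VALUE / 1_000_000_000L,
                      Long.MAX_VALUE % 1_000_000_000L);
              System.out.println("Max long-as-nanos instant: " + maxAsLongNanos);

              // The value from this report, converted to nanoseconds since the epoch.
              Instant reported = LocalDateTime.of(4247, 4, 5, 0, 0).toInstant(ZoneOffset.UTC);
              BigInteger reportedNanos = BigInteger.valueOf(reported.getEpochSecond())
                      .multiply(BigInteger.valueOf(1_000_000_000L));
              System.out.println("4247-04-05 as nanos: " + reportedNanos);
              // bitLength() >= 64 means it no longer fits a signed long, so it stays a BigInteger.
              System.out.println("Fits in a long? " + (reportedNanos.bitLength() < 64));
          }
      }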

      What Debezium connector do you use and what version?

      Oracle connector (OpenLogReplicator adapter), version 2.6

      What is the connector configuration?

      Oracle 19 -> OpenLogReplicator 1.6.1 -> Debezium 2.6.

      The configuration is kept in a CI/CD YAML file and converted into the Debezium connector configuration line by line (a registration sketch follows the property list below):

      name=colvir_1
      connector.class=io.debezium.connector.oracle.OracleConnector
      tasks.max=1
      database.hostname=10..52.
      database.port=1521
      database.oracle.version=19
      database.user=debezium
      database.password={}{}***
      database.dbname=CBSTEST
      schema.history.internal.kafka.topic=dwh-colvir1
      database.connection.adapter=olr
      topic.prefix=dwh-src
      openlogreplicator.source=CLV2_1
      openlogreplicator.host=openlog-replicator-1
      openlogreplicator.port=50000
      snapshot.mode=schema_only
      table.include.list=COLVIR.G_ACCBLNHST
      topic.creation.enable=true
      column.propagate.source.type=true
      datatype.propagate.source.type=true
      key.converter=org.apache.kafka.connect.json.JsonConverter
      value.converter=org.apache.kafka.connect.json.JsonConverter
      key.converter.schemas.enable=true
      value.converter.schemas.enable=true
      key.converter.replace.null.with.default=false
      value.converter.replace.null.with.default=false
      database.history.kafka.topic=dwh-src1.db
      schema.history.internal.store.only.captured.tables.ddl=true
      schema.history.internal.kafka.bootstrap.servers=kafka-debezium-dev..:9094
      topic.creation.default.replication.factor=3
      topic.creation.default.partitions=3
      heartbeat.interval.ms=10000
      topic.heartbeat.prefix=dwh-debez-heartbeat1
      security.protocol=SASL_PLAINTEXT
      schema.history.internal.sasl.mechanism=SCRAM-SHA-256
      schema.history.internal.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username=\"dwh\" password=\"{}{}*\";
      schema.history.internal.consumer.security.protocol=SASL_PLAINTEXT
      schema.history.internal.consumer.sasl.mechanism=SCRAM-SHA-256
      schema.history.internal.consumer.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username=\"dwh\" password=\"{}{*}\";
      schema.history.internal.producer.sasl.mechanism=SCRAM-SHA-256
      schema.history.internal.producer.security.protocol=SASL_PLAINTEXT
      schema.history.internal.producer.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username=\"dwh\" password=\"{}{*}\";
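
      A minimal sketch of that line-by-line conversion, assuming the resulting configuration is registered through the Kafka Connect REST API; the Connect URL and the properties file name are placeholders, the property values are the ones listed above:

      import java.io.FileReader;
      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;
      import java.util.Properties;
      import java.util.stream.Collectors;

      // Sketch: turn the flat property list into a {"name": ..., "config": {...}} request.
      public class RegisterConnector {
          public static void main(String[] args) throws Exception {
              Properties props = new Properties();
              try (FileReader reader = new FileReader("colvir_1.properties")) { // placeholder file name
                  props.load(reader);
              }
              String name = props.getProperty("name");
              props.remove("name");

              // Build the JSON body by hand to keep the sketch dependency-free.
              String config = props.stringPropertyNames().stream()
                      .map(k -> "\"" + k + "\": \"" + props.getProperty(k).replace("\"", "\\\"") + "\"")
                      .collect(Collectors.joining(", "));
              String body = "{\"name\": \"" + name + "\", \"config\": {" + config + "}}";

              HttpRequest request = HttpRequest.newBuilder(URI.create("http://connect:8083/connectors")) // placeholder URL
                      .header("Content-Type", "application/json")
                      .POST(HttpRequest.BodyPublishers.ofString(body))
                      .build();
              HttpResponse<String> response = HttpClient.newHttpClient()
                      .send(request, HttpResponse.BodyHandlers.ofString());
              System.out.println(response.statusCode() + " " + response.body());
          }
      }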

      What is the captured database version and mode of deployment?

      (E.g. on-premises, with a specific cloud provider, etc.)

      Kubernetes container

      What behavior do you expect?

      The maximum date value Debezium can represent should be used instead of failing.

      What behavior do you see?

      A null-value exception; the connector task fails (see the logs below).

      Do you see the same behaviour using the latest released Debezium version?

      (Ideally, also verify with latest Alpha/Beta/CR version)

      yes

      Do you have the connector logs, ideally from start till finish?

      (You might be asked later to provide DEBUG/TRACE level log)

      [2024-08-13 12:09:56,167] ERROR Failed to properly convert data value for 'CBSTEST.COLVIR.G_ACCBLNHST.TODATE' of type DATE (io.debezium.relational.TableSchemaBuilder)
      java.lang.IllegalArgumentException: Unexpected value for JDBC type 93 and column TODATE DATE NOT NULL: class=java.math.BigInteger
          at io.debezium.jdbc.JdbcValueConverters.handleUnknownData(JdbcValueConverters.java:1294)
          at io.debezium.jdbc.JdbcValueConverters.convertValue(JdbcValueConverters.java:1338)
          at io.debezium.jdbc.JdbcValueConverters.convertTimestampToEpochMillis(JdbcValueConverters.java:462)
          at io.debezium.connector.oracle.OracleValueConverters.convertTimestampToEpochMillis(OracleValueConverters.java:629)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorValueConverter.convertTimestampToEpochMillis(OpenLogReplicatorValueConverter.java:68)
          at io.debezium.jdbc.JdbcValueConverters.lambda$converter$16(JdbcValueConverters.java:337)
          at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$5(TableSchemaBuilder.java:297)
          at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:141)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitDeleteRecord(RelationalChangeRecordEmitter.java:136)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:56)
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:271)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onMutationEvent(OpenLogReplicatorStreamingChangeEventSource.java:299)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onEvent(OpenLogReplicatorStreamingChangeEventSource.java:181)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:121)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:52)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:280)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:197)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:140)
          at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:829)
      [2024-08-13 12:09:56,169] ERROR Failed: Error while processing event at offset {commit_scn=null, scn_idx=8, snapshot_scn=26203277118, scn=26203291568} (io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource)
      org.apache.kafka.connect.errors.ConnectException: Error while processing event at offset {commit_scn=null, scn_idx=8, snapshot_scn=26203277118, scn=26203291568}
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:322)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onMutationEvent(OpenLogReplicatorStreamingChangeEventSource.java:299)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onEvent(OpenLogReplicatorStreamingChangeEventSource.java:181)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:121)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:52)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:280)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:197)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:140)
          at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:829)
      Caused by: org.apache.kafka.connect.errors.DataException: Invalid value: null used for required field: "TODATE", schema type: INT64
          at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:220)
          at org.apache.kafka.connect.data.Struct.validate(Struct.java:233)
          at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:250)
          at org.apache.kafka.connect.data.Struct.put(Struct.java:216)
          at org.apache.kafka.connect.data.Struct.put(Struct.java:203)
          at io.debezium.data.Envelope.delete(Envelope.java:317)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitDeleteRecord(RelationalChangeRecordEmitter.java:143)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:56)
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:271)
          ... 12 more
      [2024-08-13 12:09:56,169] ERROR Producer failure (io.debezium.pipeline.ErrorHandler)
      org.apache.kafka.connect.errors.ConnectException: Error while processing event at offset {commit_scn=null, scn_idx=8, snapshot_scn=26203277118, scn=26203291568}
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:322)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onMutationEvent(OpenLogReplicatorStreamingChangeEventSource.java:299)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onEvent(OpenLogReplicatorStreamingChangeEventSource.java:181)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:121)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:52)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:280)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:197)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:140)
          at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:829)
      Caused by: org.apache.kafka.connect.errors.DataException: Invalid value: null used for required field: "TODATE", schema type: INT64
          at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:220)
          at org.apache.kafka.connect.data.Struct.validate(Struct.java:233)
          at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:250)
          at org.apache.kafka.connect.data.Struct.put(Struct.java:216)
          at org.apache.kafka.connect.data.Struct.put(Struct.java:203)
          at io.debezium.data.Envelope.delete(Envelope.java:317)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitDeleteRecord(RelationalChangeRecordEmitter.java:143)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:56)
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:271)
          ... 12 more
      [2024-08-13 12:09:56,169] INFO Streaming metrics dump: io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSourceMetrics@618d03ff (io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource)
      [2024-08-13 12:09:56,169] INFO Offsets: OracleOffsetContext [scn=26203291568, scnIndex=8, commit_scn=[], lcr_position=null] (io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource)
      [2024-08-13 12:09:56,169] INFO Finished streaming (io.debezium.pipeline.ChangeEventSourceCoordinator)
      [2024-08-13 12:09:56,169] INFO Connected metrics set to 'false' (io.debezium.pipeline.ChangeEventSourceCoordinator)
      [2024-08-13 12:09:56,409] ERROR WorkerSourceTask{id=colvir_1-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask)
      org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
          at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:67)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:141)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:52)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:280)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:197)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:140)
          at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:829)
      Caused by: org.apache.kafka.connect.errors.ConnectException: Error while processing event at offset {commit_scn=null, scn_idx=8, snapshot_scn=26203277118, scn=26203291568}
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:322)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onMutationEvent(OpenLogReplicatorStreamingChangeEventSource.java:299)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.onEvent(OpenLogReplicatorStreamingChangeEventSource.java:181)
          at io.debezium.connector.oracle.olr.OpenLogReplicatorStreamingChangeEventSource.execute(OpenLogReplicatorStreamingChangeEventSource.java:121)
          ... 9 more
      Caused by: org.apache.kafka.connect.errors.DataException: Invalid value: null used for required field: "TODATE", schema type: INT64
          at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:220)
          at org.apache.kafka.connect.data.Struct.validate(Struct.java:233)
          at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:250)
          at org.apache.kafka.connect.data.Struct.put(Struct.java:216)
          at org.apache.kafka.connect.data.Struct.put(Struct.java:203)
          at io.debezium.data.Envelope.delete(Envelope.java:317)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitDeleteRecord(RelationalChangeRecordEmitter.java:143)
          at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:56)
          at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:271)
          ... 12 more
      [2024-08-13 12:09:56,410] INFO Stopping down connector (io.debezium.connector.common.BaseSourceTask)

      How to reproduce the issue using our tutorial deployment?

      Use an Oracle database with a NOT NULL DATE column and insert the value 4247-04-05; the stack trace above shows the failure while a delete record is emitted for such a row.
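
      A reproduction sketch under those assumptions; the connection details and the test table are placeholders (the original failure was observed on COLVIR.G_ACCBLNHST), and the table must be added to table.include.list:

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.Statement;

      // Sketch: insert a far-future DATE into a NOT NULL column, then delete the row while
      // the connector streams changes so the delete event reaches the value converter.
      public class MaxDateRepro {
          public static void main(String[] args) throws Exception {
              try (Connection conn = DriverManager.getConnection(
                      "jdbc:oracle:thin:@//oracle-host:1521/CBSTEST", "debezium", "***")) { // placeholder host/credentials
                  conn.setAutoCommit(false);
                  try (Statement stmt = conn.createStatement()) {
                      stmt.execute("CREATE TABLE COLVIR.MAX_DATE_TEST ("             // placeholder test table
                              + "ID NUMBER PRIMARY KEY, TODATE DATE NOT NULL)");
                      stmt.execute("INSERT INTO COLVIR.MAX_DATE_TEST (ID, TODATE) "
                              + "VALUES (1, TO_DATE('4247-04-05', 'YYYY-MM-DD'))");
                      conn.commit();
                      stmt.execute("DELETE FROM COLVIR.MAX_DATE_TEST WHERE ID = 1");
                      conn.commit();
                  }
              }
          }
      }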

      Feature request or enhancement

      For feature requests or enhancements, provide this information, please:

      Which use case/requirement will be addressed by the proposed feature?

      A configuration parameter to control how such out-of-range values are handled.

      Implementation ideas (optional)

      Maybe we need to override the default value for this case; a possible workaround sketch follows.
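
      A workaround sketch using Debezium's custom-converter SPI, assuming it is acceptable to emit the affected DATE column as a string so the value never has to fit the epoch-millis INT64 range; the class and property names below are placeholders, not a confirmed fix:

      import java.util.Properties;

      import org.apache.kafka.connect.data.SchemaBuilder;

      import io.debezium.spi.converter.CustomConverter;
      import io.debezium.spi.converter.RelationalColumn;

      // Sketch: register a string schema for the affected DATE column and render whatever
      // value arrives (Instant, BigInteger, ...) as text. Names here are placeholders.
      public class DateAsStringConverter implements CustomConverter<SchemaBuilder, RelationalColumn> {

          private String targetColumn;

          @Override
          public void configure(Properties props) {
              // e.g. converters=dates, dates.type=...DateAsStringConverter, dates.column=TODATE
              targetColumn = props.getProperty("column", "TODATE");
          }

          @Override
          public void converterFor(RelationalColumn column, ConverterRegistration<SchemaBuilder> registration) {
              if (!"DATE".equalsIgnoreCase(column.typeName()) || !targetColumn.equalsIgnoreCase(column.name())) {
                  return;
              }
              registration.register(SchemaBuilder.string().optional(),
                      value -> value == null ? null : value.toString());
          }
      }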

            Assignee: Chris Cranford (ccranfor@redhat.com)
            Reporter: Aleksey Kruglov (agemlex)
            Votes: 0
            Watchers: 3

              Created:
              Updated:
              Resolved: