Debezium / DBZ-4990

Debezium Db2 Connector fails to handle default values in the schema while making the snapshot


Details

    • Type: Bug
    • Resolution: Done
    • Priority: Blocker
    • Fix Version/s: 2.0.0.CR1
    • Affects Version/s: 1.9.0.Final
    • Component/s: db2-connector
    • Labels: None

    Description

      What Debezium connector do you use and what version?

      Db2 connector, version 1.9.0

      What is the connector configuration?

      The simplest one:

      {
          "name": "db2-connector-5",
          "config": {
              "connector.class" : "io.debezium.connector.db2.Db2Connector",
              "tasks.max" : "1",
              "database.server.name" : "<>",
              "database.hostname" : "<>",
              "database.port" : "<>",
              "database.user" : "<>",
              "database.password" : "<>",
              "database.dbname" : "<>",
              "table.include.list": "<>",
              "database.history.kafka.bootstrap.servers" : "broker:29092",
              "database.history.kafka.topic": "schema-changes.<>",
              "include.schema.changes": "false",
              "value.converter.schemas.enable": "false",
              "key.converter.schemas.enable": "false"
              
          }
      }
      

      What is the captured database version and mode of deployment?

      (E.g. on-premises, with a specific cloud provider, etc.)

      On-premises, Db2 11.5

      What behaviour do you expect?

      The producer sends all the events to Kafka with the table information.

      What behaviour do you see?

      The schema is not created properly and the snapshot fails with this error:

       

      [2022-04-13 14:29:45,477] ERROR Producer failure (io.debezium.pipeline.ErrorHandler)
      kafka-connect | io.debezium.DebeziumException: io.debezium.DebeziumException: org.apache.kafka.connect.errors.SchemaBuilderException: Invalid default value
      kafka-connect | at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:85)
      kafka-connect | at io.debezium.pipeline.ChangeEventSourceCoordinator.doSnapshot(ChangeEventSourceCoordinator.java:155)
      kafka-connect | at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:137)
      kafka-connect | at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:109)
      kafka-connect | at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
      kafka-connect | at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
      kafka-connect | at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
      kafka-connect | at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
      kafka-connect | at java.base/java.lang.Thread.run(Thread.java:829)
      kafka-connect | Caused by: io.debezium.DebeziumException: org.apache.kafka.connect.errors.SchemaBuilderException: Invalid default value
      kafka-connect | at io.debezium.relational.RelationalSnapshotChangeEventSource.lambda$createSchemaChangeEventsForTables$2(RelationalSnapshotChangeEventSource.java:280)
      kafka-connect | at io.debezium.pipeline.EventDispatcher.dispatchSchemaChangeEvent(EventDispatcher.java:310)
      kafka-connect | at io.debezium.relational.RelationalSnapshotChangeEventSource.createSchemaChangeEventsForTables(RelationalSnapshotChangeEventSource.java:275)
      kafka-connect | at io.debezium.relational.RelationalSnapshotChangeEventSource.doExecute(RelationalSnapshotChangeEventSource.java:118)
      kafka-connect | at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:76)
      kafka-connect | ... 8 more
      kafka-connect | Caused by: org.apache.kafka.connect.errors.SchemaBuilderException: Invalid default value
      kafka-connect | at org.apache.kafka.connect.data.SchemaBuilder.defaultValue(SchemaBuilder.java:131)
      kafka-connect | at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:412)
      kafka-connect | at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:147)
      kafka-connect | at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
      kafka-connect | at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
      kafka-connect | at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
      kafka-connect | at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
      kafka-connect | at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
      kafka-connect | at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
      kafka-connect | at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
      kafka-connect | at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
      kafka-connect | at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
      kafka-connect | at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:145)
      kafka-connect | at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:135)
      kafka-connect | at io.debezium.connector.db2.Db2DatabaseSchema.applySchemaChange(Db2DatabaseSchema.java:49)
      kafka-connect | at io.debezium.pipeline.EventDispatcher$SchemaChangeEventReceiver.schemaChangeEvent(EventDispatcher.java:547)
      kafka-connect | at io.debezium.relational.RelationalSnapshotChangeEventSource.lambda$createSchemaChangeEventsForTables$2(RelationalSnapshotChangeEventSource.java:277)
      kafka-connect | ... 12 more
      kafka-connect | Caused by: org.apache.kafka.connect.errors.DataException: Invalid Java object for schema "io.debezium.time.Date" with type INT32: class java.lang.String
      kafka-connect | at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:242)
      kafka-connect | at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:213)
      kafka-connect | at org.apache.kafka.connect.data.SchemaBuilder.defaultValue(SchemaBuilder.java:129)
      kafka-connect | ... 28 more

       

      Do you see the same behaviour using the latest released Debezium version?

      (Ideally, also verify with latest Alpha/Beta/CR version)

      I haven't validated with it yet, because ccranfor@redhat.com asked me to create this Jira ticket.

      Do you have the connector logs, ideally from start till finish?

      (You might be asked later to provide DEBUG/TRACE level log)

      I have attached a log with one of the errors, but I have identified two:

      Caused by: org.apache.kafka.connect.errors.DataException: Invalid Java object for schema "io.debezium.time.Date" with type INT32: class java.lang.String 

      and

      Caused by: org.apache.kafka.connect.errors.DataException: Invalid Java object for schema "org.apache.kafka.connect.data.Decimal" with type BYTES: class java.lang.String
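
      Both errors seem to come from the same place: the column's DEFAULT literal is handed to Kafka Connect's SchemaBuilder.defaultValue() as a java.lang.String, while the field schema expects the already-converted Java type (an INT32 epoch-day value for io.debezium.time.Date, a java.math.BigDecimal for org.apache.kafka.connect.data.Decimal). The following standalone sketch only illustrates that Connect validation behaviour; it is not Debezium's actual code path, and the class name and literal values are made up for the example:

      import java.math.BigDecimal;
      import java.time.LocalDate;

      import org.apache.kafka.connect.data.Decimal;
      import org.apache.kafka.connect.data.Schema;
      import org.apache.kafka.connect.data.SchemaBuilder;
      import org.apache.kafka.connect.errors.SchemaBuilderException;

      public class DefaultValueSketch {
          public static void main(String[] args) {
              // io.debezium.time.Date is an INT32 schema holding days since the epoch.
              // Passing the raw DEFAULT literal (a String) is rejected by
              // ConnectSchema.validateValue, which is the exception shown in the log above.
              try {
                  SchemaBuilder.int32().name("io.debezium.time.Date")
                          .defaultValue("0001-01-01").build();
              } catch (SchemaBuilderException e) {
                  System.out.println("Date default rejected: " + e.getMessage());
              }

              // Converting the literal to epoch days first yields a valid default.
              int epochDays = (int) LocalDate.parse("0001-01-01").toEpochDay();
              Schema dateSchema = SchemaBuilder.int32().name("io.debezium.time.Date")
                      .defaultValue(epochDays).build();
              System.out.println("Date default accepted: " + dateSchema.defaultValue());

              // Same for DECIMAL columns: the default must be a BigDecimal, not a String.
              Schema decimalSchema = Decimal.builder(0)
                      .defaultValue(new BigDecimal("0")).build();
              System.out.println("Decimal default accepted: " + decimalSchema.defaultValue());
          }
      }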

      How to reproduce the issue using our tutorial deployment?

      Maybe by creating tables like the following (the problematic parts, shown in red in the original formatting, are the DEFAULT clauses on the decimal and date columns; the table names below are placeholders):

      -- placeholder table name; the original name was not included in the report
      CREATE TABLE MYSCHEMA.TABLE1 (
          KZ1# char(36) PRIMARY KEY NOT NULL,
          GILT_AB date NOT NULL,
          GILT_BIS date NOT NULL,
          DOMWERT1 char(1) NOT NULL,
          DOMWERT1ANZ char(1) DEFAULT ' ' NOT NULL,
          KZKERNDOMW char(1) DEFAULT ' ' NOT NULL,
          DOMNAME varchar(18) NOT NULL,
          SORTIERUNG decimal(3,0) DEFAULT 0.0 NOT NULL,
          DOMTEXT varchar(254) DEFAULT '' NOT NULL,
          DOMTEXT_EXT varchar(254) DEFAULT '' NOT NULL,
          PLS_LISTE varchar(254) DEFAULT '' NOT NULL,
          FIRMENLISTE varchar(254) DEFAULT '' NOT NULL
      );

       

      Or

       

      -- placeholder table name; the original name was not included in the report
      CREATE TABLE MYSCHEMA.TABLE2 (
          VTAR# char(36) NOT NULL,
          PSCHABL# char(36) NOT NULL,
          GEWINNVERBAND char(7) DEFAULT ' ' NOT NULL,
          VERWENDTYP char(1) DEFAULT ' ' NOT NULL,
          PROVTARIFGRP char(2) DEFAULT ' ' NOT NULL,
          GILTAB date DEFAULT '0001-01-01' NOT NULL,
          GILTBIS date DEFAULT '0001-01-01' NOT NULL,
          CONSTRAINT SQL180423181051950 PRIMARY KEY (VTAR#,PSCHABL#)
      );

       

      Attachments

        1. Date_DEFAULT_ERROR.txt
          13 kB
          Manuel Garcia de Vinuesa Gomez


          People

            Assignee: ccranfor@redhat.com Chris Cranford
            Reporter: manugv84 Manuel Garcia de Vinuesa Gomez (Inactive)
            Votes: 0
            Watchers: 7
