Debezium / DBZ-1876

NullPointerException on delete in ExtractNewRecordState class

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Done
    • Affects Version/s: 1.0.3.Final, 1.1.0.CR1
    • Fix Version/s: 1.1.0.Final
    • Component/s: core-library
    • Labels:
      None
    • Steps to Reproduce:

      Delete a record in a PostgreSQL table captured by the connector. The record is pushed to the topic as expected, but the connector crashes and has to be restarted manually. A minimal JDBC sketch of the triggering delete follows.
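      A minimal reproduction sketch, assuming the connector below is already registered and that the captured table is public.students with an integer id column (both assumptions; the report does not name a table). Connection details are taken from the connector config in the description; deleting any captured row shows the same behavior:

          import java.sql.Connection;
          import java.sql.DriverManager;
          import java.sql.Statement;

          public class ReproduceDelete {
              public static void main(String[] args) throws Exception {
                  // Connection details come from the connector config in the description;
                  // the table name and row id are hypothetical placeholders.
                  try (Connection conn = DriverManager.getConnection(
                          "jdbc:postgresql://postgres:5432/students", "postgres", "postgres");
                       Statement stmt = conn.createStatement()) {
                      // Deleting any captured row produces the delete event that
                      // crashes the ExtractNewRecordState transform.
                      stmt.executeUpdate("DELETE FROM public.students WHERE id = 1");
                  }
              }
          }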

      Description

      When I try to see the delete events in Kafka, the connector kills the task.

      My connector config:

      {
        "name": "postgres-source",
        "config": {
          "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
          "tasks.max": "1",
          "database.hostname": "postgres",
          "database.port": "5432",
          "database.user": "postgres",
          "database.password": "postgres",
          "database.dbname": "students",
          "database.server.name": "dbserver1",
          "database.whitelist": "students",
          "database.history.kafka.bootstrap.servers": "kafka:9092",
          "database.history.kafka.topic": "schema-changes.students",
          "key.converter": "org.apache.kafka.connect.storage.StringConverter",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "key.converter.schemas.enable": "false",
          "value.converter.schemas.enable": "true",
          "value.converter.schema.registry.url": "http://schema-registry:8081",
          "transforms": "unwrap",
          "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
          "transforms.unwrap.add.headers": "op,table,lsn,source.ts_ms",
          "transforms.unwrap.drop.tombstones": "false",
          "transforms.unwrap.delete.handling.mode": "rewrite"
        }
      }
      

      The stack trace:

       2020-03-13 08:57:50,317 ERROR  ||  WorkerSourceTask{id=postgres-source-0} Task threw an uncaught and unrecoverable exception   [org.apache.kafka.connect.runtime.WorkerTask]
      connect_1          | org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
      connect_1          |    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
      connect_1          |    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
      connect_1          |    at org.apache.kafka.connect.runtime.TransformationChain.apply(TransformationChain.java:50)
      connect_1          |    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:315)
      connect_1          |    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:240)
      connect_1          |    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
      connect_1          |    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
      connect_1          |    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
      connect_1          |    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
      connect_1          |    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
      connect_1          |    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
      connect_1          |    at java.base/java.lang.Thread.run(Thread.java:834)
      connect_1          | Caused by: java.lang.NullPointerException
      connect_1          |    at io.debezium.transforms.ExtractNewRecordState.makeHeaders(ExtractNewRecordState.java:206)
      connect_1          |    at io.debezium.transforms.ExtractNewRecordState.apply(ExtractNewRecordState.java:134)
      connect_1          |    at org.apache.kafka.connect.runtime.TransformationChain.lambda$apply$0(TransformationChain.java:50)
      connect_1          |    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
      connect_1          |    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
      connect_1          |    ... 11 more
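      For context: the failing frame is ExtractNewRecordState.makeHeaders, which runs because of the transforms.unwrap.add.headers setting above. A plausible trigger is the tombstone record that follows a delete (its value is null, and drop.tombstones is false here), since reading fields from a null record value while building headers would fail exactly like this. Below is a minimal sketch of the kind of null-safe lookup that avoids this class of NPE; it is illustrative only, the helper name is made up, and it is not the actual Debezium fix:

          import org.apache.kafka.connect.data.Struct;
          import org.apache.kafka.connect.header.Headers;

          public class NullSafeHeaders {
              // Copy one field of the Debezium envelope into a Connect header, skipping it
              // when the record value is null (tombstone) or the field is absent or null
              // (as it can be for delete events).
              static void addHeaderIfPresent(Headers headers, Struct envelope, String fieldName, String headerName) {
                  if (envelope == null) {
                      return; // tombstone record: there is no value to read from
                  }
                  if (envelope.schema().field(fieldName) == null) {
                      return; // field is not part of this record's schema
                  }
                  Object value = envelope.get(fieldName);
                  if (value != null) {
                      headers.addString(headerName, value.toString());
                  }
              }
          }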
      

              People

              • Assignee: Jiri Pechanec (jpechanec)
              • Reporter: David Courtin (sma-mco-t)
              • Votes: 0
              • Watchers: 3

                Dates

                • Created:
                • Updated:
                • Resolved: