Debezium / DBZ-5073

DML statement couldn't be parsed after adding a column to existing table


Details

    • Type: Bug
    • Resolution: Not a Bug
    • Priority: Major
    • Fix Version/s: None
    • Affects Version/s: 1.8.1.Final
    • Component/s: oracle-connector
    • Labels: None
    • Steps to Reproduce:
      1. Add a column to the table
      2. Insert rows into the table

    Description

      Bug report

      What Debezium connector do you use and what version?

      1.8.1.Final

      What is the connector configuration?

      apiVersion: kafka.strimzi.io/v1alpha1
      kind: KafkaConnector
      metadata:
        name: dfc-debezium-oracle-cdc-source-connector-v3
        namespace: kite-ent-ns
        labels:
          strimzi.io/cluster: kite-ent-connect
      spec:
        class: io.debezium.connector.oracle.OracleConnector
        tasksMax: 1
        config:
          name: dfc-debezium-oracle-cdc-source-connector-v3
          database.connection.adapter: logminer
          database.url: jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=hostname)(PORT=1521))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=SERVICE_SVC)))
          rac.nodes: ip1,ip2
          database.dbname: DBNAME
          database.port: 1521
          database.password: **********
          database.user: DBZUSER
          log.mining.strategy: redo_log_catalog
          schema.include.list: SCHEMA1,SCHEMA2
          table.include.list: SCHEMA1.TABLE_TMP
          database.server.name: dbzsrvr
          snapshot.mode: initial
          snapshot.locking.mode: shared
          database.history.skip.unparseable.ddl: true
          database.history.kafka.bootstrap.servers: kite-ent-kafka-bootstrap:9093
          database.history.kafka.topic: dfc-database-perf-test-history-internal-topic
          database.history.consumer.security.protocol: SASL_SSL
          database.history.consumer.sasl.mechanism: SCRAM-SHA-512
          database.history.consumer.sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username="kite-admin" password="*****";
          database.history.consumer.ssl.endpoint.identification.algorithm: ""
          database.history.consumer.ssl.truststore.location: /opt/kafka/external-configuration/connector-external-config/lowes-internal-truststore.jks
          database.history.consumer.ssl.truststore.password: ${*****}
          database.history.producer.security.protocol: SASL_SSL
          database.history.producer.sasl.mechanism: SCRAM-SHA-512
          database.history.producer.sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username="kite-admin" password="*****";
          database.history.producer.ssl.truststore.location: /opt/kafka/external-configuration/c****.jks
          database.history.producer.ssl.truststore.password: ${*****}
          database.history.producer.ssl.endpoint.identification.algorithm: ""
          key.converter.schemas.enable: true
          key.converter: org.apache.kafka.connect.json.JsonConverter
          value.converter.schemas.enable: true
          value.converter: org.apache.kafka.connect.json.JsonConverter
          poll.interval.ms: 30000
          provide.transaction.metadata: false
          time.precision.mode: connect
          transforms: dropPrefix
          transforms.dropPrefix.type: org.apache.kafka.connect.transforms.RegexRouter
          transforms.dropPrefix.regex: dbzsrvr.(.*)
          transforms.dropPrefix.replacement: $1

      What is the captured database version and mode of deployment?

      Oracle 12cR2

      What behaviour do you expect?

      I expect the connector to capture the new column added to the table and emit the new column's data along with the existing columns.

      What behaviour do you see?

      The connector fails with the error below.

       

      Tasks:
        Id:    0
        State: FAILED
        Trace:
      org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
          at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
          at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:191)
          at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:57)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:172)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:139)
          at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:108)
          at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:834)
      Caused by: io.debezium.connector.oracle.logminer.parser.DmlParserException: DML statement couldn't be parsed.
Please open a Jira issue with the statement 'insert into "SCHEMA1"."TABLE_TMP"("LUID","LUORIGIN","OFFSET_X0_NORMVALUE","OFFSET_X0_VALUE","OFFSET_X0_UNIT_ID","OFFSET_Y0_NORMVALUE","OFFSET_Y0_VALUE","OFFSET_Y0_UNIT_ID","OFFSET_Z0_NORMVALUE","OFFSET_Z0_VALUE","OFFSET_Z0_UNIT_ID","KEYDATA_DANGOODSPOINTS","KEYDATA_GOODSVOLUME_NORMVALUE","KEYDATA_GOODSVOLUME_VALUE","KEYDATA_GOODSVOLUME_UNIT_ID","KEYDATA_GROSSWEIGHT_NORMVALUE","KEYDATA_GROSSWEIGHT_VALUE","KEYDATA_GROSSWEIGHT_UNIT_ID","KEYDATA_NUMPRIMARYSOS","KEYDATA_WEIGHTKIND","VERSIONCOUNTING_COUNTER","VERSIONCOUNTING_EMITTER","HISTORY_CREATEDAT","HISTORY_CREATEDBY","HISTORY_MODIFIEDAT","HISTORY_MODIFIEDBY","LASTMOVEMENT_KIND","LASTMOVEMENT_TIME","STOLOC_STOLOCID","STOLOC_WHLOCID","LUIDKIND","LUKIND","SYSTEMLU","WHOLELU","EXTLUID","TAREWEIGHTMEASURED_NORMVALUE","TAREWEIGHTMEASURED_VALUE","TAREWEIGHTMEASURED_UNIT_ID","ORIGDATA_CUBATURE_CUBATUREID","INVFIELDS_INVBLOCK","INVFIELDS_LASTINVTIME","TRANSBLOCK","LOCPOSTINGINFO_SEQINITTIME","LOCPOSTINGINFO_SEQNAME","LOCPOSTINGINFO_SEQUENCE","PARKED","CUSTOMGROUP","SEQUENCEWITHINSTOCOMP","ORIENTATION","WAATTRS_CAGE","WAATTRS_EMPTY","WAATTRS_EMPTYSTACK","WAATTRS_MFSBOOKINGCOUNT","WAATTRS_MOVCNTGRP","WAATTRS_MOVCNTLASTINCTIME","WAATTRS_MOVCTNVALUE","MASTERLUKIND","ITEMBARCODE","PRJGAYLORDISFULL","PRJIDENTIFYBYSHIPPINGLABEL","PRJLUISMIXEDPALLETFORITEMSETUP","PRJPICKINGPOLYBAGISFULL","PRJRECSORTLUID","LOADAIDID","CUBATUREID","LUCOMPDIVID","FOGGRAPH_FOGGRAPHID","FOGGRAPH_WHLOCID","MASTER_LUID","MASTER_LUORIGIN","ROOTMASTER_LUID","ROOTMASTER_LUORIGIN","TEMPORARYSTOCK","DISCRIMINATOR","PRJSHIPPINGLABEL","PRJSDCSHIPPINGLABELFILENAME","PRJSDCSHPLABELSEQNR","INVFIELDS_LASTINVUSER","PRJOBDGROUPING","CUSTOMER_NAME") values ('100057322','L3311','1.05E-001','.105','m','0','0','m','0','0','m','0','1.852098747408E-002','1130.22','cu inch','2.1780144830289998E+001','48.017','lbs','1','MEASURED','11787','152362026451231317',TO_TIMESTAMP('2019-09-20 06:36:09.'),'RS TX Target 
[2a6fad]-1',TO_TIMESTAMP('2022-03-31 12:15:54.'),'RS TX Target [afd605]','TRANSPORT',TO_TIMESTAMP('2022-03-31 12:15:54.'),'NRA3201X34560Y03Z1','L3311','NO','NORMAL','1','0',NULL,'0','0',NULL,NULL,'0',NULL,'0',TO_TIMESTAMP('2022-03-31 08:41:31.'),'SEQ-mfsBin','52448','0',NULL,'0','O0','0','0','0','12489','RACK',TO_TIMESTAMP('2022-03-31 12:15:42.'),'2','NONE',NULL,'0','0','0','0',NULL,'SMALL_TOTE','SMALL_TOTE','SmallTote','gPal_SORT','L3311',NULL,NULL,NULL,NULL,NULL,'LoadUnit',NULL,NULL,NULL,NULL,NULL,NULL);'.
          at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.parseDmlStatement(AbstractLogMinerEventProcessor.java:901)
          at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.lambda$handleDataEvent$5(AbstractLogMinerEventProcessor.java:712)
          at io.debezium.connector.oracle.logminer.processor.memory.MemoryLogMinerEventProcessor.addToTransaction(MemoryLogMinerEventProcessor.java:211)
          at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.handleDataEvent(AbstractLogMinerEventProcessor.java:711)
          at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processRow(AbstractLogMinerEventProcessor.java:291)
          at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processResults(AbstractLogMinerEventProcessor.java:241)
          at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.process(AbstractLogMinerEventProcessor.java:187)
          at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:178)
          ... 9 more

      Do you see the same behaviour using the latest released Debezium version?

      We ran into the issue on version 1.8.1. This behaviour has not been tested on 1.9.2.

      Do you have the connector logs, ideally from start till finish?

      The same DmlParserException stack trace as above; the full log is attached as Error_Log.txt.

      How to reproduce the issue using our tutorial deployment?

      The connector fails after adding a new column to the table, followed by an insert operation.
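      The failing sequence can be sketched as follows. This is illustrative only: `CUSTOMER_NAME` is assumed to be the newly added column because it appears last in the failing statement's column list, and the column type and inserted values are made up.

      ```sql
      -- Table is already being captured by the connector.
      ALTER TABLE SCHEMA1.TABLE_TMP ADD (CUSTOMER_NAME VARCHAR2(100));

      -- The next insert touching the widened table is the statement
      -- LogMiner reads back and the Debezium DML parser rejects.
      INSERT INTO SCHEMA1.TABLE_TMP (LUID, LUORIGIN, CUSTOMER_NAME)
      VALUES ('100057322', 'L3311', 'some value');
      COMMIT;
      ```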


      Attachments

        1. connector_config.json
          3 kB
        2. database_history_topic.txt
          42 kB
        3. Error_Log.txt
          37 kB


          People

            Assignee: Chris Cranford (ccranfor@redhat.com)
            Reporter: Sri M (dbzusr_sri) (Inactive)
            Votes: 0
            Watchers: 3
