Debezium / DBZ-9031

Column named SEQUENCE, a MySQL keyword, fails to be parsed


    • Priority: Moderate

      Bug report

      What Debezium connector do you use and what version?

      We use the Debezium connector for MySQL. The issue arises in debezium-server versions 3.0.8.Final and 3.1.1.Final.

      What is the connector configuration?

      debezium:
        format:
          schemas:
            enable: false
        source:
          topic:
            prefix: "test"
          heartbeat:
            interval:
              ms: 300000
          connector:
            class: io.debezium.connector.mysql.MySqlConnector
          database:
            allowPublicKeyRetrieval: true
            ssl:
              mode: "required"
            user: ${DEBEZIUM_USERNAME}
            password: ${DEBEZIUM_PASSWORD}
            hostname: ${DEBEZIUM_HOSTNAME}
            port: ${DEBEZIUM_PORT}
            server:
              name: everesst
              id: ${DEBEZIUM_DATABASE_SERVER_ID}
            include:
              list: ${debezium.source.table.include.prefix}.*
          schema:
            history:
              internal:
                ".": io.debezium.storage.file.history.FileSchemaHistory
                file.filename: data/history.dat
                store.only.captured.tables.ddl: true
          poll.interval.ms: 100
          offset:
            flush.interval.ms: 0
            storage:
              ".": be.everesst.io.debezium.server.FileOffsetBackingStore
              file.filename: data/offsets.dat
          snapshot:
            mode: initial
            locking:
              mode: none
          table:
            include:
              prefix: ${KUBERNETES_NAMESPACE_NAME}_
              list: ${debezium.source.table.include.prefix}.*\\.PUBLIC_EVENT_RECORD
          key:
            converter:
              class: org.apache.kafka.connect.storage.StringConverter
          value:
            converter:
              class: org.apache.kafka.connect.storage.StringConverter
          header:
            converter:
              class: org.apache.kafka.connect.storage.StringConverter
        sink:
          type: pulsar
          pulsar:
            namespace: ${KUBERNETES_NAMESPACE_NAME}.pulsar-public-instance
            tenant: serviceBrokerPublic
            client:
              useTls: true
              tlsAllowInsecureConnection: false
              authPluginClassName: org.apache.pulsar.client.impl.auth.AuthenticationTls
              authParams: "tlsCertFile:/certs/tls.crt,tlsKeyFile:/certs/tls.key"
              serviceUrl: ${PULSAR_BROKERURL}
              tlsTrustCertsFilePath: ${PULSAR_CA_CERT_PATH}
            producer:
              batchingEnabled: false
        transforms:
          ".": defaultKey,defaultPayload,defaultId,tracing,extract,createKey,extractKey,idToHeaders,extractPayload
          defaultKey:
            type: org.apache.kafka.connect.transforms.InsertField$Value
            static:
              field: MESSAGE_KEY
              value: placeholder_message_key
          defaultPayload:
            type: org.apache.kafka.connect.transforms.InsertField$Value
            static:
              field: PAYLOAD
              value: '{}'
          defaultId:
            type: org.apache.kafka.connect.transforms.InsertField$Value
            static:
              field: ID
              value: placeholder_id
          extract:
            type: io.debezium.transforms.ExtractNewRecordState
            route.by.field: TOPIC
          createKey:
            type: org.apache.kafka.connect.transforms.ValueToKey
            fields: MESSAGE_KEY
          extractKey:
            type: org.apache.kafka.connect.transforms.ExtractField$Key
            field: MESSAGE_KEY
          idToHeaders:
            type: org.apache.kafka.connect.transforms.HeaderFrom$Value
            fields: ID
            headers: ID
            operation: copy
          extractPayload:
            type: org.apache.kafka.connect.transforms.ExtractField$Value
            field: PAYLOAD
          log:
            type: be.everesst.transform.LogKeyValueHeaders

      What is the captured database version and mode of deployment?

      Percona XtraDB Cluster 8.0.40-31

      Private cloud, deployed on Kubernetes.

      What behavior do you expect?

      DDL parsing should work correctly and not throw errors.

      What behavior do you see?

      Debezium throws an error: `DDL statement couldn't be parsed`.

      Do you see the same behaviour using the latest released Debezium version?

      We currently use `3.0.7.Final`; we cannot upgrade to `3.0.8.Final` or `3.1.1.Final`.

      Do you have the connector logs, ideally from start till finish?

      org.antlr.v4.runtime.atn.ParserATNSimulator.noViableAlt (line 2014)
      org.antlr.v4.runtime.atn.ParserATNSimulator.execATN (line 445)
      org.antlr.v4.runtime.atn.ParserATNSimulator.adaptivePredict (line 371)
      io.debezium.ddl.parser.mysql.generated.MySqlParser.sqlStatements (line 1056)
      io.debezium.ddl.parser.mysql.generated.MySqlParser.root (line 980)
      io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree (line 74)
      io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree (line 48)
      io.debezium.antlr.AntlrDdlParser.parse (line 76)
      io.debezium.connector.binlog.BinlogDatabaseSchema.parseDdl (line 311)
      io.debezium.connector.binlog.BinlogDatabaseSchema.parseStreamingDdl (line 258)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.handleQueryEvent (line 738)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.lambda$execute$5 (line 179)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.handleEvent (line 571)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.lambda$execute$17 (line 209)
      com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners (line 1281)
      com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets (line 1103)
      com.github.shyiko.mysql.binlog.BinaryLogClient.connect (line 657)
      com.github.shyiko.mysql.binlog.BinaryLogClient$7.run (line 959)
      java.lang.Thread.run (line 1583)

      io.debezium.antlr.ParsingErrorListener.syntaxError (line 43)
      org.antlr.v4.runtime.ProxyErrorListener.syntaxError (line 41)
      org.antlr.v4.runtime.Parser.notifyErrorListeners (line 544)
      org.antlr.v4.runtime.DefaultErrorStrategy.reportNoViableAlternative (line 310)
      org.antlr.v4.runtime.DefaultErrorStrategy.reportError (line 136)
      io.debezium.ddl.parser.mysql.generated.MySqlParser.sqlStatements (line 1264)
      io.debezium.ddl.parser.mysql.generated.MySqlParser.root (line 980)
      io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree (line 74)
      io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree (line 48)
      io.debezium.antlr.AntlrDdlParser.parse (line 76)
      io.debezium.connector.binlog.BinlogDatabaseSchema.parseDdl (line 311)
      io.debezium.connector.binlog.BinlogDatabaseSchema.parseStreamingDdl (line 258)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.handleQueryEvent (line 738)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.lambda$execute$5 (line 179)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.handleEvent (line 571)
      io.debezium.connector.binlog.BinlogStreamingChangeEventSource.lambda$execute$17 (line 209)
      com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners (line 1281)
      com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets (line 1103)
      com.github.shyiko.mysql.binlog.BinaryLogClient.connect (line 657)
      com.github.shyiko.mysql.binlog.BinaryLogClient$7.run (line 959)
      java.lang.Thread.run (line 1583)
      DDL statement couldn't be parsed. Please open a Jira issue with the statement 'CREATE TABLE `test`.PUBLIC_EVENT_RECORD (SEQUENCE BIGINT AUTO_INCREMENT NOT NULL, ID char(36) NOT NULL, PAYLOAD LONGTEXT NOT NULL, METADATA LONGTEXT NULL, MESSAGE_KEY VARCHAR(255) NOT NULL, TOPIC VARCHAR(255) NOT NULL, CONSTRAINT PK_PUBLIC_EVENT_RECORD PRIMARY KEY (SEQUENCE), CONSTRAINT PUBLIC_EVENT_RECORD_ID_UQ UNIQUE (ID))'
      no viable alternative at input 'CREATE TABLE `test`.PUBLIC_EVENT_RECORD (SEQUENCE BIGINT'
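      A possible workaround, which we have not verified against the affected versions, is to back-quote the SEQUENCE column in the DDL so it is lexed as a plain identifier rather than a keyword:

      ```sql
      -- Same table as in the error log, with the SEQUENCE column quoted.
      -- Back-quoting identifiers is always legal in MySQL and should
      -- sidestep the keyword handling in Debezium's DDL grammar.
      CREATE TABLE `test`.PUBLIC_EVENT_RECORD (
        `SEQUENCE` BIGINT AUTO_INCREMENT NOT NULL,
        ID CHAR(36) NOT NULL,
        PAYLOAD LONGTEXT NOT NULL,
        METADATA LONGTEXT NULL,
        MESSAGE_KEY VARCHAR(255) NOT NULL,
        TOPIC VARCHAR(255) NOT NULL,
        CONSTRAINT PK_PUBLIC_EVENT_RECORD PRIMARY KEY (`SEQUENCE`),
        CONSTRAINT PUBLIC_EVENT_RECORD_ID_UQ UNIQUE (ID)
      );
      ```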

      How to reproduce the issue using our tutorial deployment?

      Use MySQL 8.0.40 in combination with debezium-server `3.0.8.Final` or `3.1.1.Final` and execute the DDL statement provided in the error logs.
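      For reference, the failing statement from the error log, reformatted for readability:

      ```sql
      -- Fails to parse in Debezium 3.0.8.Final / 3.1.1.Final,
      -- although MySQL itself accepts it (SEQUENCE is not reserved there).
      CREATE TABLE `test`.PUBLIC_EVENT_RECORD (
        SEQUENCE BIGINT AUTO_INCREMENT NOT NULL,
        ID char(36) NOT NULL,
        PAYLOAD LONGTEXT NOT NULL,
        METADATA LONGTEXT NULL,
        MESSAGE_KEY VARCHAR(255) NOT NULL,
        TOPIC VARCHAR(255) NOT NULL,
        CONSTRAINT PK_PUBLIC_EVENT_RECORD PRIMARY KEY (SEQUENCE),
        CONSTRAINT PUBLIC_EVENT_RECORD_ID_UQ UNIQUE (ID)
      );
      ```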

              Assignee: Chris Cranford (ccranfor@redhat.com)
              Reporter: Thomas Verhoeven (thomasverhoeven1998, Inactive)
              Votes: 0
              Watchers: 2