Type: Bug
Resolution: Done
Priority: Major
In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.
Bug report
Exception in the Kafka connector log file; the other tables in the schema are captured fine, only this one fails.
DDL of the failing table:
-- auto-generated definition
create table SETRAN
(
    TXNUM   VARCHAR2(100),
    TXDATE  DATE,
    ACCTNO  VARCHAR2(20),
    TXCD    VARCHAR2(4),
    NAMT    NUMBER default 0,
    CAMT    VARCHAR2(50),
    REF     VARCHAR2(50),
    DELTD   VARCHAR2(1),
    AUTOID  NUMBER(20) default 0
        constraint SETRAN_UK unique,
    ACCTREF VARCHAR2(20),
    TLTXCD  VARCHAR2(4),
    BKDATE  DATE,
    TRDESC  VARCHAR2(1000)
)
/

create index SETRAN_ACCTNO_IDX on SETRAN (ACCTNO)
/

create index SETRAN_TXNUM_TXDATE_IDX on SETRAN (TXNUM, TXDATE)
/
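As a cross-check, the hand-written script above can be compared with the CREATE TABLE statement the data dictionary generates, which may carry extra clauses (such as the logical replication setting that appears in the log further down). The following is a minimal JDBC sketch using Oracle's standard DBMS_METADATA.GET_DDL function; the JDBC URL, user, and password are placeholders loosely based on the connector configuration, and the Oracle JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch only: connection details are placeholders, adjust for your environment.
public class DumpSetranDictionaryDdl {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//your.host:1521/orcl", "ani_cdc", "<password>");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT DBMS_METADATA.GET_DDL('TABLE', 'SETRAN', 'BOSIT') FROM DUAL")) {
            while (rs.next()) {
                // Print the dictionary's version of the CREATE TABLE statement.
                System.out.println(rs.getString(1));
            }
        }
    }
}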
What Debezium connector do you use and what version?
1.9.7.Final
What is the connector configuration?
{ "binary.handling.mode":"bytes", "connector.class":"io.debezium.connector.oracle.OracleConnector", "database.dbname":"orcl", "database.history.consumer.sasl.jaas.config":"org.apache.kafka.common.security.plain.PlainLoginModule required username=\"connector\" password=\"Y29ubmVjdG9y\";", "database.history.consumer.sasl.mechanism":"PLAIN", "database.history.consumer.security.protocol":"SASL_PLAINTEXT", "database.history.kafka.bootstrap.servers":" ani-node1:9092,ani-node2:9092", "database.history.kafka.topic":"cd.ddl-change", "database.history.producer.linger.ms":"100", "database.history.producer.sasl.jaas.config":"org.apache.kafka.common.security.plain.PlainLoginModule required username=\"connector\" password=\"Y29ubmVjdG9y\";", "database.history.producer.sasl.mechanism":"PLAIN", "database.history.producer.security.protocol":"SASL_PLAINTEXT", "database.history.skip.unparseable.ddl":"true", "database.history.store.only.captured.tables.ddl":"true", "database.hostname":"your.host", "database.password":"decrypted", "database.port":"1521", "database.server.name":"sit_source_flex_connector", "database.user":"ani_cdc", "decimal.handling.mode":"double", "header.converter":"org.apache.kafka.connect.json.JsonConverter", "heartbeat.interval.ms":"3000", "heartbeat.topics.prefix":"heartbeat", "key.converter":"io.apicurio.registry.utils.converter.AvroConverter", "key.converter.apicurio.registry.auto-register":"true", "key.converter.apicurio.registry.find-latest":"true", "key.converter.apicurio.registry.url":"http://10.82.81.128:9090/apis/registry/v2", "lob.enabled":"true", "log.mining.buffer.infinispan.cache.events":"<local-cache name=\"events\">\n <persistence passivation=\"false\">\n<file-store fetch-state=\"true\" read-only=\"false\" preload=\"true\" shared=\"false\" segmented=\"false\" path=\"/data/infinispan/flex/events\" />\n</persistence>\n</local-cache>", "log.mining.buffer.infinispan.cache.processed_transactions":"<local-cache name=\"processed_transactions\">\n <persistence passivation=\"false\">\n<file-store fetch-state=\"true\" read-only=\"false\" preload=\"true\" shared=\"false\" segmented=\"false\" path=\"/data/infinispan/flex/processed_transactions\" />\n</persistence>\n</local-cache>", "log.mining.buffer.infinispan.cache.schema_changes":"<local-cache name=\"schema_changes\">\n <persistence passivation=\"false\">\n<file-store fetch-state=\"true\" read-only=\"false\" preload=\"true\" shared=\"false\" segmented=\"false\" path=\"/data/infinispan/flex/schema_changes\" />\n</persistence>\n</local-cache>", "log.mining.buffer.infinispan.cache.transactions":"<local-cache name=\"transactions\">\n <persistence passivation=\"false\">\n<file-store fetch-state=\"true\" read-only=\"false\" preload=\"true\" shared=\"false\" segmented=\"false\" path=\"/data/infinispan/flex/transactions\" />\n</persistence>\n</local-cache>", "log.mining.buffer.type":"infinispan", "log.mining.session.max.ms":"3600000", "log.mining.strategy":"online_catalog", "message.key.columns":"", "name":"sit_source_flex_connector", "producer.override.linger.ms":"100", "signal.data.collection":",BOSIT.SETRAN", "skipped.operations":"t", "snapshot.locking.mode":"none", "snapshot.mode":"SCHEMA_ONLY", "table.include.list":",BOSIT.SETRAN", "time.precision.mode":"connect", "topic.creation.default.cleanup.policy":"delete", "topic.creation.default.compression.type":"zstd", "topic.creation.default.partitions":"1", "topic.creation.default.replication.factor":"2", "topic.creation.default.retention.ms":"259200000", 
"transforms":"dropPrefix,changeTopicCase", "transforms.AddPrefix.type":"org.apache.kafka.connect.transforms.RegexRouter", "transforms.changeTopicCase.from":"UPPER_UNDERSCORE", "transforms.changeTopicCase.to":"LOWER_UNDERSCORE", "transforms.changeTopicCase.type":"com.github.jcustenborder.kafka.connect.transform.common.ChangeTopicCase", "transforms.dropPrefix.regex":"sit_source_flex_connector.(BOSIT).(.*)", "transforms.dropPrefix.replacement":"$1__$2", "transforms.dropPrefix.type":"org.apache.kafka.connect.transforms.RegexRouter", "transforms.unwrap.drop.deletes":"false", "transforms.unwrap.drop.tombstones":"false", "value.converter":"io.apicurio.registry.utils.converter.AvroConverter", "value.converter.apicurio.registry.auto-register":"true", "value.converter.apicurio.registry.find-latest":"true", "value.converter.apicurio.registry.url":" http://10.82.81.128:9090/apis/registry/v2" }
What is the captured database version and mode of deployment?
on-premises
What behaviour do you expect?
Messages are produced into the Kafka topic.
What behaviour do you see?
[2023-05-19 19:00:30,573] INFO [sit_source_flex_connector|task-0] Getting database metadata for table 'ORCL.BOSIT.SETRAN' (io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor:951)
[2023-05-19 19:00:30,809] INFO [sit_source_flex_connector|task-0] Connection gracefully closed (io.debezium.jdbc.JdbcConnection:962)
[2023-05-19 19:00:30,809] INFO [sit_source_flex_connector|task-0] Table 'ORCL.BOSIT.SETRAN' is new and will now be captured. (io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor:922)
[2023-05-19 19:00:30,940] WARN [sit_source_flex_connector|task-0] Ignoring unparsable DDL statement ' CREATE TABLE "BOSIT"."SETRAN" ( "TXNUM" VARCHAR2(100), "TXDATE" DATE, "ACCTNO" VARCHAR2(20), "TXCD" VARCHAR2(4), "NAMT" NUMBER DEFAULT 0, "CAMT" VARCHAR2(50), "REF" VARCHAR2(50), "DELTD" VARCHAR2(1), "AUTOID" NUMBER(20,0) DEFAULT 0, "ACCTREF" VARCHAR2(20), "TLTXCD" VARCHAR2(4), "BKDATE" DATE, "TRDESC" VARCHAR2(1000) ) DISABLE LOGICAL REPLICATION ; CREATE UNIQUE INDEX "BOSIT"."SETRAN_UK" ON "BOSIT"."SETRAN" ("AUTOID") ; ALTER TABLE "BOSIT"."SETRAN" ADD CONSTRAINT "SETRAN_UK" UNIQUE ("AUTOID") USING INDEX "BOSIT"."SETRAN_UK" ENABLE;': {} (io.debezium.connector.oracle.OracleSchemaChangeEventEmitter:88)
io.debezium.text.ParsingException: DDL statement couldn't be parsed. Please open a Jira issue with the statement ' CREATE TABLE "BOSIT"."SETRAN" ( "TXNUM" VARCHAR2(100), "TXDATE" DATE, "ACCTNO" VARCHAR2(20), "TXCD" VARCHAR2(4), "NAMT" NUMBER DEFAULT 0, "CAMT" VARCHAR2(50), "REF" VARCHAR2(50), "DELTD" VARCHAR2(1), "AUTOID" NUMBER(20,0) DEFAULT 0, "ACCTREF" VARCHAR2(20), "TLTXCD" VARCHAR2(4), "BKDATE" DATE, "TRDESC" VARCHAR2(1000) ) DISABLE LOGICAL REPLICATION ; CREATE UNIQUE INDEX "BOSIT"."SETRAN_UK" ON "BOSIT"."SETRAN" ("AUTOID") ; ALTER TABLE "BOSIT"."SETRAN" ADD CONSTRAINT "SETRAN_UK" UNIQUE ("AUTOID") USING INDEX "BOSIT"."SETRAN_UK" ENABLE;'
no viable alternative at input 'DISABLE LOGICAL'
    at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:43)
    at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
    at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
    at org.antlr.v4.runtime.DefaultErrorStrategy.reportNoViableAlternative(DefaultErrorStrategy.java:310)
    at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:136)
    at io.debezium.ddl.parser.oracle.generated.PlSqlParser.relational_table(PlSqlParser.java:51695)
    at io.debezium.ddl.parser.oracle.generated.PlSqlParser.create_table(PlSqlParser.java:49797)
    at io.debezium.ddl.parser.oracle.generated.PlSqlParser.unit_statement(PlSqlParser.java:2528)
    at io.debezium.connector.oracle.antlr.OracleDdlParser.parseTree(OracleDdlParser.java:73)
    at io.debezium.connector.oracle.antlr.OracleDdlParser.parseTree(OracleDdlParser.java:32)
    at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:82)
    at io.debezium.connector.oracle.antlr.OracleDdlParser.parse(OracleDdlParser.java:68)
    at io.debezium.connector.oracle.OracleSchemaChangeEventEmitter.emitSchemaChangeEvent(OracleSchemaChangeEventEmitter.java:84)
    at io.debezium.pipeline.EventDispatcher.dispatchSchemaChangeEvent(EventDispatcher.java:302)
    at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.dispatchSchemaChangeEventAndGetTableForNewCapturedTable(AbstractLogMinerEventProcessor.java:924)
    at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.getTableForDataEvent(AbstractLogMinerEventProcessor.java:822)
    at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.handleDataEvent(AbstractLogMinerEventProcessor.java:753)
    at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processRow(AbstractLogMinerEventProcessor.java:293)
    at io.debezium.connector.oracle.logminer.processor.infinispan.AbstractInfinispanLogMinerEventProcessor.processRow(AbstractInfinispanLogMinerEventProcessor.java:141)
    at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processResults(AbstractLogMinerEventProcessor.java:243)
    at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.process(AbstractLogMinerEventProcessor.java:189)
    at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:210)
    at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:60)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:174)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:141)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:109)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.antlr.v4.runtime.NoViableAltException
    at org.antlr.v4.runtime.atn.ParserATNSimulator.noViableAlt(ParserATNSimulator.java:2026)
    at org.antlr.v4.runtime.atn.ParserATNSimulator.execATN(ParserATNSimulator.java:467)
    at org.antlr.v4.runtime.atn.ParserATNSimulator.adaptivePredict(ParserATNSimulator.java:393)
    at io.debezium.ddl.parser.oracle.generated.PlSqlParser.relational_table(PlSqlParser.java:51645)
    ... 26 more
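The failure can likely be reproduced without a database by feeding the logged statement straight into the connector's DDL parser. The sketch below is illustrative only: the OracleDdlParser and Tables classes are taken from the stack trace above, but the no-arg constructor and the setCurrentDatabase/setCurrentSchema calls are assumptions about the parser API and may need adjusting against the 1.9.7.Final sources.

import io.debezium.connector.oracle.antlr.OracleDdlParser;
import io.debezium.relational.Tables;

// Illustrative reproduction sketch for the "no viable alternative at input 'DISABLE LOGICAL'" failure.
public class SetranDdlParseRepro {
    public static void main(String[] args) {
        // DDL as reported in the connector log, including the clause the parser rejects.
        String ddl = "CREATE TABLE \"BOSIT\".\"SETRAN\" (\"TXNUM\" VARCHAR2(100), \"TXDATE\" DATE, "
                + "\"ACCTNO\" VARCHAR2(20), \"TXCD\" VARCHAR2(4), \"NAMT\" NUMBER DEFAULT 0, "
                + "\"CAMT\" VARCHAR2(50), \"REF\" VARCHAR2(50), \"DELTD\" VARCHAR2(1), "
                + "\"AUTOID\" NUMBER(20,0) DEFAULT 0, \"ACCTREF\" VARCHAR2(20), \"TLTXCD\" VARCHAR2(4), "
                + "\"BKDATE\" DATE, \"TRDESC\" VARCHAR2(1000)) DISABLE LOGICAL REPLICATION;";

        OracleDdlParser parser = new OracleDdlParser(); // assumed constructor
        parser.setCurrentDatabase("ORCL");              // assumed setter
        parser.setCurrentSchema("BOSIT");               // assumed setter

        // On 1.9.7.Final this is expected to throw io.debezium.text.ParsingException
        // with "no viable alternative at input 'DISABLE LOGICAL'".
        parser.parse(ddl, new Tables());
    }
}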
Do you see the same behaviour using the latest released Debezium version?
I haven't tried, because of an incompatibility with a dependency.
Do you have the connector logs, ideally from start till finish?
(You might be asked later to provide DEBUG/TRACE level log)
<Your answer>
How to reproduce the issue using our tutorial deployment?
DBMS: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production (Version 19.3.0.0.0)
Kafka version: 3.4.0
Feature request or enhancement
For feature requests or enhancements, please provide the following information:
Which use case/requirement will be addressed by the proposed feature?
<Your answer>
Implementation ideas (optional)
<Your answer>
Links to: RHEA-2023:120698 Red Hat build of Debezium 2.3.4 release