Issue Type: Bug
Resolution: Done
Priority: Minor
Affects Version: 1.8.1.Final
In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.
Bug report
For bug reports, provide this information, please:
What Debezium connector do you use and what version?
Debezium Oracle connector (io.debezium.connector.oracle.OracleConnector), version 1.8.1.Final.
What is the connector configuration?
apiVersion: kafka.strimzi.io/v1alpha1
kind: KafkaConnector
metadata:
  name: dfc-DB-debezium-oracle-cdc-source-connector
  namespace: kite-ent-ns
  labels:
    strimzi.io/cluster: kite-ent-connect
spec:
  class: io.debezium.connector.oracle.OracleConnector
  tasksMax: 1
  config:
    name: dfc-DB-debezium-oracle-cdc-source-connector
    database.connection.adapter: logminer
    database.url: jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=hostname)(PORT=1521))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=service)))
    rac.nodes: IP1,IP2
    database.dbname: ORCLDB
    database.port: 1521
    database.user: DBZUSR
    database.password: ******
    table.include.list: TABLE1, TABLE2
    database.server.name: dbzdfcsrvr
    snapshot.mode: initial
    snapshot.locking.mode: shared
    database.history.kafka.bootstrap.servers: kite-ent-kafka-bootstrap:9093
    database.history.kafka.topic: dfc-database-history-internal-topic
    database.history.consumer.security.protocol: SASL_SSL
    database.history.consumer.sasl.mechanism: SCRAM-SHA-512
    database.history.consumer.sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username="kite-admin" password="*****";
    database.history.consumer.ssl.endpoint.identification.algorithm: ""
    database.history.consumer.ssl.truststore.location: *****
    database.history.consumer.ssl.truststore.password: ****
    database.history.producer.security.protocol: SASL_SSL
    database.history.producer.sasl.mechanism: SCRAM-SHA-512
    database.history.producer.sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username="kite-admin" password="*****";
    database.history.producer.ssl.truststore.location: *****
    database.history.producer.ssl.truststore.password: ******
    database.history.producer.ssl.endpoint.identification.algorithm: ""
    key.converter.schemas.enable: true
    key.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: true
    value.converter: org.apache.kafka.connect.json.JsonConverter
    poll.interval.ms: 30000
    provide.transaction.metadata: false
    time.precision.mode: connect
    transforms: dropPrefix
    transforms.dropPrefix.type: org.apache.kafka.connect.transforms.RegexRouter
    transforms.dropPrefix.regex: dbzdfcsrvr.(.*)
    transforms.dropPrefix.replacement: $1
What is the captured database version and mode of deployment?
(E.g. on-premises, with a specific cloud provider, etc.)
Oracle 19c
What behaviour do you expect?
We expect the connector to ignore the ALTER TABLE statement and continue processing CDC events.
What behaviour do you see?
The connector fails when we try to add a foreign key constraint to a table.
Do you see the same behaviour using the latest released Debezium version?
(Ideally, also verify with latest Alpha/Beta/CR version)
We have not tested with Debezium 1.9.
Do you have the connector logs, ideally from start till finish?
(You might be asked later to provide DEBUG/TRACE level log)
We see the error below:
Tasks:
Id: 0
State: FAILED
Trace: org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:191)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:57)
at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:172)
at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:139)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:108)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.debezium.text.ParsingException: DDL statement couldn't be parsed. Please open a Jira issue with the statement 'alter table AbsTDDesc add constraint F_AbsTDDesc_AbstractTr_1vl7ewo foreign key (deviceId, whLocId) references AbstractTransportDevice;'
no viable alternative at input 'constraint F_AbsTDDesc_AbstractTr_1vl7ewo foreign key (deviceId, whLocId) references AbstractTransportDevice;'
at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:43)
at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportNoViableAlternative(DefaultErrorStrategy.java:310)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:136)
at io.debezium.ddl.parser.oracle.generated.PlSqlParser.constraint_clauses(PlSqlParser.java)
at io.debezium.ddl.parser.oracle.generated.PlSqlParser.alter_table(PlSqlParser.java)
at io.debezium.ddl.parser.oracle.generated.PlSqlParser.unit_statement(PlSqlParser.java:2333)
at io.debezium.ddl.parser.oracle.generated.PlSqlParser.sql_script(PlSqlParser.java:2021)
at io.debezium.connector.oracle.antlr.OracleDdlParser.parseTree(OracleDdlParser.java:73)
at io.debezium.connector.oracle.antlr.OracleDdlParser.parseTree(OracleDdlParser.java:32)
at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:82)
at io.debezium.connector.oracle.antlr.OracleDdlParser.parse(OracleDdlParser.java:68)
at io.debezium.connector.oracle.OracleSchemaChangeEventEmitter.emitSchemaChangeEvent(OracleSchemaChangeEventEmitter.java:85)
at io.debezium.pipeline.EventDispatcher.dispatchSchemaChangeEvent(EventDispatcher.java:308)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.handleSchemaChange(AbstractLogMinerEventProcessor.java:556)
at io.debezium.connector.oracle.logminer.processor.memory.MemoryLogMinerEventProcessor.handleSchemaChange(MemoryLogMinerEventProcessor.java:187)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processRow(AbstractLogMinerEventProcessor.java:277)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processResults(AbstractLogMinerEventProcessor.java:241)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.process(AbstractLogMinerEventProcessor.java:187)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:178)
... 9 more
Caused by: org.antlr.v4.runtime.NoViableAltException
at org.antlr.v4.runtime.atn.ParserATNSimulator.noViableAlt(ParserATNSimulator.java:2026)
at org.antlr.v4.runtime.atn.ParserATNSimulator.execATN(ParserATNSimulator.java:467)
at org.antlr.v4.runtime.atn.ParserATNSimulator.adaptivePredict(ParserATNSimulator.java:393)
How to reproduce the issue using our tutorial deployment?
The issue is reproducible: while the connector is streaming, executing an ALTER TABLE ... ADD CONSTRAINT ... FOREIGN KEY statement against a captured table (such as the statement quoted in the trace above) fails the task with the ParsingException. It can also be reproduced outside the connector, as sketched below.
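A minimal sketch (not part of the original report) that feeds the failing DDL straight into Debezium's Oracle DDL parser, assuming Debezium 1.8.1.Final on the classpath. The OracleDdlParser and Tables class names come from the stack trace; the schema name and the CREATE TABLE column definitions are invented for illustration, and only the ALTER TABLE statement is taken verbatim from the log:

import io.debezium.connector.oracle.antlr.OracleDdlParser;
import io.debezium.relational.Tables;

public class ForeignKeyDdlParseRepro {

    public static void main(String[] args) {
        OracleDdlParser parser = new OracleDdlParser();
        parser.setCurrentDatabase("ORCLDB");   // matches database.dbname above
        parser.setCurrentSchema("DBZUSR");     // assumed owning schema

        Tables tables = new Tables();

        // Referenced and referencing tables (column lists are assumptions).
        parser.parse("CREATE TABLE AbstractTransportDevice (deviceId NUMBER, whLocId NUMBER, "
                + "CONSTRAINT PK_AbstractTransportDevice PRIMARY KEY (deviceId, whLocId));", tables);
        parser.parse("CREATE TABLE AbsTDDesc (deviceId NUMBER, whLocId NUMBER, descr VARCHAR2(100));", tables);

        // The statement from the connector log: a foreign key that references the
        // parent table without an explicit column list.
        parser.parse("alter table AbsTDDesc add constraint F_AbsTDDesc_AbstractTr_1vl7ewo "
                + "foreign key (deviceId, whLocId) references AbstractTransportDevice;", tables);
    }
}

With 1.8.1.Final, the last parse(...) call is expected to throw the same io.debezium.text.ParsingException ("no viable alternative at input ...") that is shown in the task trace above.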