Type: Bug
Resolution: Done
Priority: Major
Affects Version: 1.9.5.Final
Bug report
After adding a new table to an existing connector, the connector throws an exception:
Caused by: Multiple parsing errors
io.debezium.text.ParsingException: DDL statement couldn't be parsed. Please open a Jira issue with the statement ' ....
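For reference, a failing statement can be tested against Debezium's Oracle grammar in isolation. The sketch below is a minimal example, assuming the io.debezium.connector.oracle.antlr.OracleDdlParser API as it exists in the 1.9 line; the DDL text and schema name are placeholders, since the real statement is truncated in the log above:

import io.debezium.connector.oracle.antlr.OracleDdlParser;
import io.debezium.relational.Tables;

public class DdlParseCheck {

    public static void main(String[] args) {
        // Placeholder DDL; the actual failing statement is elided in the log above.
        String ddl = "CREATE TABLE DBZUSER.NEW_TABLE (ID NUMBER(10) NOT NULL, NAME VARCHAR2(100))";

        OracleDdlParser parser = new OracleDdlParser();
        parser.setCurrentDatabase("TEFOFF31"); // matches database.dbname in the config below
        parser.setCurrentSchema("DBZUSER");    // hypothetical schema name

        Tables tables = new Tables();
        // parse() throws io.debezium.text.ParsingException ("Multiple parsing errors")
        // when the ANTLR grammar cannot handle the statement.
        parser.parse(ddl, tables);
        System.out.println("Parsed OK: " + tables.tableIds());
    }
}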
What Debezium connector do you use and what version?
1.9.5.Final
What is the connector configuration?
{
  "connector.class": "io.debezium.connector.oracle.OracleConnector",
  "database.history.consumer.sasl.jaas.config": "${secrets:draak/draak-connect-user:sasl.jaas.config}",
  "tasks.max": "1",
  "database.history.kafka.topic": "t3.foff.history-changes",
  "transforms": "FilterFields",
  "database.history.consumer.ssl.truststore.password": "${secrets:draak/draak-cluster-ca-cert:ca.password}",
  "database.history.consumer.security.protocol": "SASL_SSL",
  "database.history.consumer.ssl.truststore.location": "/opt/kafka/external-configuration/cluster-ca-cert/ca.p12",
  "log.mining.strategy": "online_catalog",
  "tombstones.on.delete": "false",
  "decimal.handling.mode": "string",
  "transforms.FilterFields.error.mode": "throw_error",
  "transforms.FilterFields.metainfo.service.url": "some_url",
  "database.history.store.only.captured.tables.ddl": "true",
  "heartbeat.topics.prefix": "__debezium-heartbeat",
  "transforms.FilterFields.cache.expiration.minutes": "60",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "log.mining.transaction.retention.hours": "5",
  "database.history.producer.sasl.jaas.config": "${secrets:draak/draak-connect-user:sasl.jaas.config}",
  "database.history.producer.sasl.mechanism": "SCRAM-SHA-512",
  "database.dbname": "tefoff31",
  "database.user": "${secrets:draak/debezium-credentials:dbz_user}",
  "database.connection.adapter": "logminer",
  "database.history.producer.security.protocol": "SASL_SSL",
  "database.history.kafka.bootstrap.servers": "${env:KAFKA_CONNECT_BOOTSTRAP_SERVERS}",
  "database.history.producer.ssl.truststore.location": "/opt/kafka/external-configuration/cluster-ca-cert/ca.p12",
  "database.url": "${secrets:draak/debezium-credentials:foff_url}",
  "database.server.name": "T3.FOFF",
  "transforms.FilterFields.type": "com.ahold.debezium.connect.transformation.FilterTableFieldsTransformation",
  "heartbeat.interval.ms": "30000",
  "log.mining.username.exclude.list": "user",
  "key.converter.schemas.enable": "false",
  "value.converter.schema.registry.url": "some_url",
  "database.history.producer.ssl.truststore.password": "${secrets:draak/draak-cluster-ca-cert:ca.password}",
  "database.password": "${secrets:draak/debezium-credentials:dbz_password}",
  "value.converter.schemas.enable": "false",
  "name": "debezium-t3-foff-source",
  "table.include.list": "Table1, ....... , TableN",
  "database.history.consumer.sasl.mechanism": "SCRAM-SHA-512",
  "database.history.consumer.ssl.truststore.type": "PKCS12",
  "snapshot.mode": "schema_only",
  "database.history.producer.ssl.truststore.type": "PKCS12"
}
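For context, "adding a new table" here means appending it to table.include.list and resubmitting the configuration. A minimal sketch of such an update, assuming the standard Kafka Connect REST endpoint (PUT /connectors/{name}/config) on a hypothetical localhost:8083; the abbreviated body stands in for the full configuration above, and NEW_TABLE is a placeholder:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class UpdateConnectorConfig {

    public static void main(String[] args) throws Exception {
        // Hypothetical Connect REST endpoint; adjust host/port for the actual cluster.
        String url = "http://localhost:8083/connectors/debezium-t3-foff-source/config";

        // Abbreviated body: in practice the full config above is resent,
        // with the new table appended to table.include.list.
        String body = """
                {
                  "connector.class": "io.debezium.connector.oracle.OracleConnector",
                  "table.include.list": "Table1, ....... , TableN, NEW_TABLE"
                }
                """;

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}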
What is the captured database version and mode of deployment?
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.13.0.0.0
What behaviour do you expect?
No errors; the connector should continue running after the new table is added.
What behaviour do you see?
Errors in the logs; the connector stops.
Do you see the same behaviour using the latest released Debezium version?
Checked only with 1.9.5.Final
is duplicated by:
DBZ-5956: DDL statement couldn't be parsed exception, during incremental snapshot execution (Closed)