Details
- Bug
- Resolution: Not a Bug
- Blocker
- None
- 1.7.0.Alpha1, 1.7.0.Final, 1.7.1.Final, 1.8.0.Alpha1
- None
- False
- False
Description
Hello, I've been using Debezium for 5 months now, and upgrading from 1.6.3 to 1.7+ has broken the PostgreSQL connector for me.
I'm running the connector on Strimzi with a custom Docker image that includes both the PostgreSQL and MongoDB connectors. The PostgreSQL connector appears to start up fine, checking the snapshot status and flushing some events, but it eventually fails with the message `No usable CloudEvents converters for connector type "postgresql"` (error from this line).
After some debugging, I found that the provider class was not loaded at all (the providers list is empty), which feels odd, since the connector does run for a while before going down.
The issue isn't present on our MongoDB connector, so I wonder if there is some configuration to add or change to make it work?
I can reproduce this bug on every release later than 1.7.0.Alpha1.
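To illustrate what "the providers list is empty" means here: Debezium's CloudEventsConverter discovers its per-connector providers via `java.util.ServiceLoader`, which only sees implementations registered on the classloader that loads the converter. The snippet below is a minimal, hypothetical re-creation of that lookup (the `Provider` interface and `lookup` method are stand-ins, not Debezium's actual API); with no service registration visible to the classloader, the iterator is empty and the lookup throws the same kind of error:

```java
import java.util.ServiceLoader;

public class ProviderLookupSketch {

    // Stand-in for a provider SPI such as Debezium's CloudEventsProvider.
    public interface Provider {
        String getName();
    }

    static Provider lookup(String connectorType) {
        // ServiceLoader only finds implementations whose
        // META-INF/services entry is visible to this classloader.
        // Under Kafka Connect plugin isolation (e.g. on Strimzi), a
        // provider packaged in a different plugin directory is invisible,
        // so this loop iterates over nothing.
        for (Provider provider : ServiceLoader.load(Provider.class)) {
            if (provider.getName().equals(connectorType)) {
                return provider;
            }
        }
        // No registered provider: we fall through to the failure seen
        // in the logs below.
        throw new RuntimeException(
                "No usable CloudEvents converters for connector type \"" + connectorType + "\"");
    }

    public static void main(String[] args) {
        try {
            lookup("postgresql");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This would explain why the connector starts normally and only fails later: the lookup happens per record during conversion, not at startup.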
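For reference, the converter is enabled with a configuration along these lines (a sketch only: the converter class name is Debezium's documented `CloudEventsConverter`, but the serializer setting shown is an assumption and other connector-specific settings are omitted):

```properties
# Excerpt of the Kafka Connect connector config enabling CloudEvents output.
# Hypothetical minimal form; real config carries many more properties.
value.converter=io.debezium.converters.CloudEventsConverter
value.converter.serializer.type=json
```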
Relevant part of the logs:
2021-12-09 09:17:17,041 ERROR WorkerSourceTask{id=metapro-postgresql-multitenant-dataset-tenants-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask) [task-thread-metapro-postgresql-multitenant-dataset-tenants-0]
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:206)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:132)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:298)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:324)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:248)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:182)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:231)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.connect.errors.DataException: No usable CloudEvents converters for connector type "postgresql"
    at io.debezium.converters.CloudEventsConverter.lookupCloudEventsProvider(CloudEventsConverter.java:268)
    at io.debezium.converters.CloudEventsConverter.fromConnectData(CloudEventsConverter.java:214)
    at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(WorkerSourceTask.java:298)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:156)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:190)
    ... 11 more