- Type: Bug
- Resolution: Done
- Priority: Major
- Fix Version: 1.1.0.CR1
- Labels: None
Oplog events fail to be processed because MongoDbEventMetadataProvider looks up an incorrect field name. Here is the error from the log:
2020-03-16 19:48:26,471 ERROR MongoDB|Qa|task1 Producer failure [io.debezium.pipeline.ErrorHandler]
org.apache.kafka.connect.errors.ConnectException: Error while processing event at offset {sec=1584388106, ord=1, h=6099280404115304850}
	at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:178)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.handleOplogEvent(MongoDbStreamingChangeEventSource.java:350)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.readOplog(MongoDbStreamingChangeEventSource.java:192)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$null$0(MongoDbStreamingChangeEventSource.java:96)
	at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:278)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$null$1(MongoDbStreamingChangeEventSource.java:95)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.kafka.connect.errors.DataException: ts_ms is not a valid field name
	at org.apache.kafka.connect.data.Struct.lookupField(Struct.java:254)
	at org.apache.kafka.connect.data.Struct.getCheckType(Struct.java:261)
	at org.apache.kafka.connect.data.Struct.getInt64(Struct.java:130)
	at io.debezium.connector.mongodb.MongoDbEventMetadataProvider.getEventTimestamp(MongoDbEventMetadataProvider.java:35)
	at io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics.onEvent(StreamingChangeEventSourceMetrics.java:82)
	at io.debezium.pipeline.EventDispatcher$2.changeRecord(EventDispatcher.java:161)
	at io.debezium.connector.mongodb.MongoDbChangeRecordEmitter.createAndEmitChangeRecord(MongoDbChangeRecordEmitter.java:100)
	at io.debezium.connector.mongodb.MongoDbChangeRecordEmitter.emitUpdateRecord(MongoDbChangeRecordEmitter.java:77)
	at io.debezium.connector.mongodb.MongoDbChangeRecordEmitter.emitUpdateRecord(MongoDbChangeRecordEmitter.java:27)
	at io.debezium.pipeline.AbstractChangeRecordEmitter.emitChangeRecords(AbstractChangeRecordEmitter.java:41)
	at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:154)
	... 10 more
2020-03-16 19:48:26,614 INFO || WorkerSourceTask{id=qa-kafka-mongo-connector-1} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-03-16 19:48:26,615 INFO || WorkerSourceTask{id=qa-kafka-mongo-connector-1} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-03-16 19:48:26,618 ERROR || WorkerSourceTask{id=qa-kafka-mongo-connector-1} Task threw an uncaught and unrecoverable exception [org.apache.kafka.connect.runtime.WorkerTask]
org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
	at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.handleOplogEvent(MongoDbStreamingChangeEventSource.java:358)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.readOplog(MongoDbStreamingChangeEventSource.java:192)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$null$0(MongoDbStreamingChangeEventSource.java:96)
	at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:278)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$null$1(MongoDbStreamingChangeEventSource.java:95)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.kafka.connect.errors.ConnectException: Error while processing event at offset {sec=1584388106, ord=1, h=6099280404115304850}
	at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:178)
	at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.handleOplogEvent(MongoDbStreamingChangeEventSource.java:350)
	... 9 more
Caused by: org.apache.kafka.connect.errors.DataException: ts_ms is not a valid field name
	at org.apache.kafka.connect.data.Struct.lookupField(Struct.java:254)
	at org.apache.kafka.connect.data.Struct.getCheckType(Struct.java:261)
	at org.apache.kafka.connect.data.Struct.getInt64(Struct.java:130)
	at io.debezium.connector.mongodb.MongoDbEventMetadataProvider.getEventTimestamp(MongoDbEventMetadataProvider.java:35)
	at io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics.onEvent(StreamingChangeEventSourceMetrics.java:82)
	at io.debezium.pipeline.EventDispatcher$2.changeRecord(EventDispatcher.java:161)
	at io.debezium.connector.mongodb.MongoDbChangeRecordEmitter.createAndEmitChangeRecord(MongoDbChangeRecordEmitter.java:100)
	at io.debezium.connector.mongodb.MongoDbChangeRecordEmitter.emitUpdateRecord(MongoDbChangeRecordEmitter.java:77)
	at io.debezium.connector.mongodb.MongoDbChangeRecordEmitter.emitUpdateRecord(MongoDbChangeRecordEmitter.java:27)
	at io.debezium.pipeline.AbstractChangeRecordEmitter.emitChangeRecords(AbstractChangeRecordEmitter.java:41)
	at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:154)
	... 10 more
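For context on the failure mode: Struct.getInt64("ts_ms") throws DataException whenever the struct's schema has no field of that name, and the offset keys in the log (sec, ord) suggest the MongoDB source info carries differently named timestamp fields. The sketch below is illustrative only, not Debezium's actual fix; a plain Map stands in for Kafka Connect's Struct, and the fallback field name "sec" is an assumption taken from the log. It shows the defensive pattern of probing for a field before reading it, rather than assuming "ts_ms" exists:

```java
import java.util.Map;
import java.util.OptionalLong;

public class EventTimestampSketch {

    // Probe the source-info "struct" (here a Map, standing in for
    // org.apache.kafka.connect.data.Struct) before reading a field,
    // instead of unconditionally calling getInt64("ts_ms").
    public static OptionalLong eventTimestamp(Map<String, Object> sourceInfo) {
        if (sourceInfo.containsKey("ts_ms")) {
            // Base-framework style: timestamp already in milliseconds.
            return OptionalLong.of(((Number) sourceInfo.get("ts_ms")).longValue());
        }
        if (sourceInfo.containsKey("sec")) {
            // Assumed oplog-style field from the log's offset keys:
            // seconds since epoch, converted to milliseconds.
            return OptionalLong.of(((Number) sourceInfo.get("sec")).longValue() * 1000L);
        }
        // No known timestamp field: report "unknown" instead of throwing.
        return OptionalLong.empty();
    }

    public static void main(String[] args) {
        Map<String, Object> mongoSource = Map.of("sec", 1584388106L, "ord", 1);
        System.out.println(eventTimestamp(mongoSource)); // prints OptionalLong[1584388106000]
    }
}
```

With the unconditional lookup, the same input would instead surface as the DataException seen in the stack trace and stop the connector.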
Is caused by:
- DBZ-1726 Move MongoDB connector to base framework (Closed)