[2022-08-02 13:09:49,221] INFO [document_db_connectorII|task-0] Opened connection [connectionId{localValue:25}] to xxxxxxx.docdb.amazonaws.com:27017 (org.mongodb.driver.connection:71)
[2022-08-02 13:09:49,221] INFO [document_db_connectorII|task-0] Opened connection [connectionId{localValue:26}] to xxxxxxx.docdb.amazonaws.com:27017 (org.mongodb.driver.connection:71)
[2022-08-02 13:09:49,221] INFO [document_db_connectorII|task-0] Monitor thread successfully connected to server with description ServerDescription{address=xxxxxxx.docdb.amazonaws.com:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=7, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=5133498, setName='rs0', canonicalAddress=xxxxxxx.docdb.amazonaws.com:27017, hosts=[xxxxxxx.amazonaws.com:27017, xxxxxxx.docdb.amazonaws.com:27017], passives=[], arbiters=[], primary='xxxxxxx.docdb.amazonaws.com:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000002, setVersion=null, topologyVersion=null, lastWriteDate=Tue Aug 02 13:09:49 UTC 2022, lastUpdateTimeNanos=4536246918629} (org.mongodb.driver.cluster:71)
[2022-08-02 13:09:49,242] INFO [document_db_connectorII|task-0] Opened connection [connectionId{localValue:27}] to xxxxxxx.docdb.amazonaws.com:27017 (org.mongodb.driver.connection:71)
[2022-08-02 13:09:49,245] INFO [document_db_connectorII|task-0] MongoClient with metadata {"driver": {"name": "mongo-java-driver|sync", "version": "4.6.1"}, "os": {"type": "Linux", "name": "Linux", "architecture": "amd64", "version": "5.15.0-1015-aws"}, "platform": "Java/Private Build/11.0.15+10-Ubuntu-0ubuntu0.22.04.1"} created with settings MongoClientSettings{readPreference=primary, writeConcern=WriteConcern{w=null, wTimeout=null ms, journal=null}, retryWrites=true, retryReads=true, readConcern=ReadConcern{level=null}, credential=MongoCredential{mechanism=null, userName='nishant', source='admin', password=, mechanismProperties=}, streamFactoryFactory=null, commandListeners=[], codecRegistry=ProvidersCodecRegistry{codecProviders=[ValueCodecProvider{}, BsonValueCodecProvider{}, DBRefCodecProvider{}, DBObjectCodecProvider{}, DocumentCodecProvider{}, IterableCodecProvider{}, MapCodecProvider{}, GeoJsonCodecProvider{}, GridFSFileCodecProvider{}, Jsr310CodecProvider{}, JsonObjectCodecProvider{}, BsonCodecProvider{}, EnumCodecProvider{}, com.mongodb.Jep395RecordCodecProvider@23597ef8]}, clusterSettings={hosts=[xxxxxxx.docdb.amazonaws.com:27017], srvServiceName=mongodb, mode=SINGLE, requiredClusterType=UNKNOWN, requiredReplicaSetName='null', serverSelector='null', clusterListeners='[]', serverSelectionTimeout='30000 ms', localThreshold='30000 ms'}, socketSettings=SocketSettings{connectTimeoutMS=10000, readTimeoutMS=0, receiveBufferSize=0, sendBufferSize=0}, heartbeatSocketSettings=SocketSettings{connectTimeoutMS=10000, readTimeoutMS=10000, receiveBufferSize=0, sendBufferSize=0}, connectionPoolSettings=ConnectionPoolSettings{maxSize=100, minSize=0, maxWaitTimeMS=120000, maxConnectionLifeTimeMS=0, maxConnectionIdleTimeMS=0, maintenanceInitialDelayMS=0, maintenanceFrequencyMS=60000, connectionPoolListeners=[], maxConnecting=2}, serverSettings=ServerSettings{heartbeatFrequencyMS=10000, minHeartbeatFrequencyMS=500, serverListeners='[]', serverMonitorListeners='[]'}, sslSettings=SslSettings{enabled=false, invalidHostNameAllowed=false, context=null}, applicationName='null', compressorList=[], uuidRepresentation=UNSPECIFIED, serverApi=null, autoEncryptionSettings=null, contextProvider=null} (org.mongodb.driver.client:71)
[2022-08-02 13:09:49,245] INFO [document_db_connectorII|task-0] Cluster description not yet available. Waiting for 30000 ms before timing out (org.mongodb.driver.cluster:71)
[2022-08-02 13:09:49,250] INFO [document_db_connectorII|task-0] Opened connection [connectionId{localValue:28}] to xxxxxxx.docdb.amazonaws.com:27017 (org.mongodb.driver.connection:71)
[2022-08-02 13:09:49,250] INFO [document_db_connectorII|task-0] Monitor thread successfully connected to server with description ServerDescription{address=xxxxxxx.docdb.amazonaws.com:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=7, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=4115690, setName='rs0', canonicalAddress=xxxxxxx.docdb.amazonaws.com:27017, hosts=[xxxxxxx.amazonaws.com:27017, xxxxxxx.docdb.amazonaws.com:27017], passives=[], arbiters=[], primary='xxxxxxx.docdb.amazonaws.com:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000002, setVersion=null, topologyVersion=null, lastWriteDate=Tue Aug 02 13:09:49 UTC 2022, lastUpdateTimeNanos=4536276362698} (org.mongodb.driver.cluster:71)
[2022-08-02 13:09:49,250] INFO [document_db_connectorII|task-0] Opened connection [connectionId{localValue:29}] to xxxxxxx.docdb.amazonaws.com:27017 (org.mongodb.driver.connection:71)
[2022-08-02 13:09:49,271] INFO [document_db_connectorII|task-0] Opened connection [connectionId{localValue:30}] to xxxxxxx.docdb.amazonaws.com:27017 (org.mongodb.driver.connection:71)
[2022-08-02 13:09:49,273] INFO [document_db_connectorII|task-0] Reading change stream for 'rs0/xxxxxxx.docdb.amazonaws.com:27017' primary xxxxxxx.docdb.amazonaws.com:27017 starting at Timestamp{value=0, seconds=0, inc=0} (io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource:204)
[2022-08-02 13:09:49,276] INFO [document_db_connectorII|task-0] Resume token not available, starting streaming from time 'Timestamp{value=0, seconds=0, inc=0}' (io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource:225)
[2022-08-02 13:09:49,285] ERROR [document_db_connectorII|task-0] Streaming for replica set rs0 failed (io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource:121)
org.apache.kafka.connect.errors.ConnectException: Error while attempting to read from change stream on 'rs0/xxxxxxx.docdb.amazonaws.com:27017'
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$establishConnectionToPrimary$3(MongoDbStreamingChangeEventSource.java:165)
    at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:286)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.streamChangesForReplicaSet(MongoDbStreamingChangeEventSource.java:115)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.execute(MongoDbStreamingChangeEventSource.java:96)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.execute(MongoDbStreamingChangeEventSource.java:52)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:174)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:141)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:109)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: com.mongodb.MongoCommandException: Command failed with error 136: 'CappedPositionLost: CollectionScan died due to position in capped collection being deleted.' on server xxxxxxx.docdb.amazonaws.com:27017. The full response is {"ok": 0.0, "operationTime": {"$timestamp": {"t": 1659445789, "i": 1}}, "code": 136, "errmsg": "CappedPositionLost: CollectionScan died due to position in capped collection being deleted."}
    at com.mongodb.internal.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:198)
    at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:413)
    at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:337)
    at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:116)
    at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:644)
    at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:71)
    at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:240)
    at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:226)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:126)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:116)
    at com.mongodb.internal.connection.DefaultServer$OperationCountTrackingConnection.command(DefaultServer.java:345)
    at com.mongodb.internal.operation.CommandOperationHelper.createReadCommandAndExecute(CommandOperationHelper.java:232)
    at com.mongodb.internal.operation.CommandOperationHelper.lambda$executeRetryableRead$4(CommandOperationHelper.java:214)
    at com.mongodb.internal.operation.OperationHelper.lambda$withSourceAndConnection$2(OperationHelper.java:575)
    at com.mongodb.internal.operation.OperationHelper.withSuppliedResource(OperationHelper.java:600)
    at com.mongodb.internal.operation.OperationHelper.lambda$withSourceAndConnection$3(OperationHelper.java:574)
    at com.mongodb.internal.operation.OperationHelper.withSuppliedResource(OperationHelper.java:600)
    at com.mongodb.internal.operation.OperationHelper.withSourceAndConnection(OperationHelper.java:573)
    at com.mongodb.internal.operation.CommandOperationHelper.lambda$executeRetryableRead$5(CommandOperationHelper.java:211)
    at com.mongodb.internal.async.function.RetryingSyncSupplier.get(RetryingSyncSupplier.java:65)
    at com.mongodb.internal.operation.CommandOperationHelper.executeRetryableRead(CommandOperationHelper.java:217)
    at com.mongodb.internal.operation.CommandOperationHelper.executeRetryableRead(CommandOperationHelper.java:197)
    at com.mongodb.internal.operation.AggregateOperationImpl.execute(AggregateOperationImpl.java:195)
    at com.mongodb.internal.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:347)
    at com.mongodb.internal.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:343)
    at com.mongodb.internal.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:538)
    at com.mongodb.internal.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:343)
    at com.mongodb.internal.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:58)
    at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:191)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.execute(ChangeStreamIterableImpl.java:221)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.cursor(ChangeStreamIterableImpl.java:174)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.iterator(ChangeStreamIterableImpl.java:169)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.readChangeStream(MongoDbStreamingChangeEventSource.java:233)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$streamChangesForReplicaSet$0(MongoDbStreamingChangeEventSource.java:116)
    at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:282)
    ... 11 more
[2022-08-02 13:09:49,286] ERROR [document_db_connectorII|task-0] Producer failure (io.debezium.pipeline.ErrorHandler:35)
org.apache.kafka.connect.errors.ConnectException: Error while attempting to read from change stream on 'rs0/xxxxxxx.docdb.amazonaws.com:27017'
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$establishConnectionToPrimary$3(MongoDbStreamingChangeEventSource.java:165)
    at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:286)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.streamChangesForReplicaSet(MongoDbStreamingChangeEventSource.java:115)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.execute(MongoDbStreamingChangeEventSource.java:96)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.execute(MongoDbStreamingChangeEventSource.java:52)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:174)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:141)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:109)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: com.mongodb.MongoCommandException: Command failed with error 136: 'CappedPositionLost: CollectionScan died due to position in capped collection being deleted.' on server xxxxxxx.docdb.amazonaws.com:27017. The full response is {"ok": 0.0, "operationTime": {"$timestamp": {"t": 1659445789, "i": 1}}, "code": 136, "errmsg": "CappedPositionLost: CollectionScan died due to position in capped collection being deleted."}
    at com.mongodb.internal.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:198)
    at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:413)
    at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:337)
    at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:116)
    at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:644)
    at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:71)
    at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:240)
    at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:226)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:126)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:116)
    at com.mongodb.internal.connection.DefaultServer$OperationCountTrackingConnection.command(DefaultServer.java:345)
    at com.mongodb.internal.operation.CommandOperationHelper.createReadCommandAndExecute(CommandOperationHelper.java:232)
    at com.mongodb.internal.operation.CommandOperationHelper.lambda$executeRetryableRead$4(CommandOperationHelper.java:214)
    at com.mongodb.internal.operation.OperationHelper.lambda$withSourceAndConnection$2(OperationHelper.java:575)
    at com.mongodb.internal.operation.OperationHelper.withSuppliedResource(OperationHelper.java:600)
    at com.mongodb.internal.operation.OperationHelper.lambda$withSourceAndConnection$3(OperationHelper.java:574)
    at com.mongodb.internal.operation.OperationHelper.withSuppliedResource(OperationHelper.java:600)
    at com.mongodb.internal.operation.OperationHelper.withSourceAndConnection(OperationHelper.java:573)
    at com.mongodb.internal.operation.CommandOperationHelper.lambda$executeRetryableRead$5(CommandOperationHelper.java:211)
    at com.mongodb.internal.async.function.RetryingSyncSupplier.get(RetryingSyncSupplier.java:65)
    at com.mongodb.internal.operation.CommandOperationHelper.executeRetryableRead(CommandOperationHelper.java:217)
    at com.mongodb.internal.operation.CommandOperationHelper.executeRetryableRead(CommandOperationHelper.java:197)
    at com.mongodb.internal.operation.AggregateOperationImpl.execute(AggregateOperationImpl.java:195)
    at com.mongodb.internal.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:347)
    at com.mongodb.internal.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:343)
    at com.mongodb.internal.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:538)
    at com.mongodb.internal.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:343)
    at com.mongodb.internal.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:58)
    at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:191)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.execute(ChangeStreamIterableImpl.java:221)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.cursor(ChangeStreamIterableImpl.java:174)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.iterator(ChangeStreamIterableImpl.java:169)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.readChangeStream(MongoDbStreamingChangeEventSource.java:233)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$streamChangesForReplicaSet$0(MongoDbStreamingChangeEventSource.java:116)
    at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:282)
    ... 11 more
[2022-08-02 13:09:49,287] INFO [document_db_connectorII|task-0] Closing all connections to rs0/xxxxxxx.docdb.amazonaws.com:27017 (io.debezium.connector.mongodb.ConnectionContext:100)
[2022-08-02 13:09:49,290] INFO [document_db_connectorII|task-0] Finished streaming (io.debezium.pipeline.ChangeEventSourceCoordinator:175)
[2022-08-02 13:09:49,290] INFO [document_db_connectorII|task-0] Connected metrics set to 'false' (io.debezium.pipeline.ChangeEventSourceCoordinator:236)
[2022-08-02 13:09:49,592] ERROR [document_db_connectorII|task-0] WorkerSourceTask{id=document_db_connectorII-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:207)
org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:50)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.streamChangesForReplicaSet(MongoDbStreamingChangeEventSource.java:122)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.execute(MongoDbStreamingChangeEventSource.java:96)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.execute(MongoDbStreamingChangeEventSource.java:52)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:174)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:141)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:109)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.connect.errors.ConnectException: Error while attempting to read from change stream on 'rs0/xxxxxxx.docdb.amazonaws.com:27017'
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$establishConnectionToPrimary$3(MongoDbStreamingChangeEventSource.java:165)
    at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:286)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.streamChangesForReplicaSet(MongoDbStreamingChangeEventSource.java:115)
    ... 10 more
Caused by: com.mongodb.MongoCommandException: Command failed with error 136: 'CappedPositionLost: CollectionScan died due to position in capped collection being deleted.' on server xxxxxxx.docdb.amazonaws.com:27017. The full response is {"ok": 0.0, "operationTime": {"$timestamp": {"t": 1659445789, "i": 1}}, "code": 136, "errmsg": "CappedPositionLost: CollectionScan died due to position in capped collection being deleted."}
    at com.mongodb.internal.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:198)
    at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:413)
    at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:337)
    at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:116)
    at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:644)
    at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:71)
    at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:240)
    at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:226)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:126)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:116)
    at com.mongodb.internal.connection.DefaultServer$OperationCountTrackingConnection.command(DefaultServer.java:345)
    at com.mongodb.internal.operation.CommandOperationHelper.createReadCommandAndExecute(CommandOperationHelper.java:232)
    at com.mongodb.internal.operation.CommandOperationHelper.lambda$executeRetryableRead$4(CommandOperationHelper.java:214)
    at com.mongodb.internal.operation.OperationHelper.lambda$withSourceAndConnection$2(OperationHelper.java:575)
    at com.mongodb.internal.operation.OperationHelper.withSuppliedResource(OperationHelper.java:600)
    at com.mongodb.internal.operation.OperationHelper.lambda$withSourceAndConnection$3(OperationHelper.java:574)
    at com.mongodb.internal.operation.OperationHelper.withSuppliedResource(OperationHelper.java:600)
    at com.mongodb.internal.operation.OperationHelper.withSourceAndConnection(OperationHelper.java:573)
    at com.mongodb.internal.operation.CommandOperationHelper.lambda$executeRetryableRead$5(CommandOperationHelper.java:211)
    at com.mongodb.internal.async.function.RetryingSyncSupplier.get(RetryingSyncSupplier.java:65)
    at com.mongodb.internal.operation.CommandOperationHelper.executeRetryableRead(CommandOperationHelper.java:217)
    at com.mongodb.internal.operation.CommandOperationHelper.executeRetryableRead(CommandOperationHelper.java:197)
    at com.mongodb.internal.operation.AggregateOperationImpl.execute(AggregateOperationImpl.java:195)
    at com.mongodb.internal.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:347)
    at com.mongodb.internal.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:343)
    at com.mongodb.internal.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:538)
    at com.mongodb.internal.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:343)
    at com.mongodb.internal.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:58)
    at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:191)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.execute(ChangeStreamIterableImpl.java:221)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.cursor(ChangeStreamIterableImpl.java:174)
    at com.mongodb.client.internal.ChangeStreamIterableImpl.iterator(ChangeStreamIterableImpl.java:169)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.readChangeStream(MongoDbStreamingChangeEventSource.java:233)
    at io.debezium.connector.mongodb.MongoDbStreamingChangeEventSource.lambda$streamChangesForReplicaSet$0(MongoDbStreamingChangeEventSource.java:116)
    at io.debezium.connector.mongodb.ConnectionContext$MongoPrimary.execute(ConnectionContext.java:282)
    ... 11 more
[2022-08-02 13:09:49,593] INFO [document_db_connectorII|task-0] Stopping down connector (io.debezium.connector.common.BaseSourceTask:238)
[2022-08-02 13:09:49,594] INFO [document_db_connectorII|task-0] [Producer clientId=connector-producer-document_db_connectorII-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1249)
[2022-08-02 13:09:49,596] INFO [document_db_connectorII|task-0] Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:676)
[2022-08-02 13:09:49,596] INFO [document_db_connectorII|task-0] Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:680)
[2022-08-02 13:09:49,596] INFO [document_db_connectorII|task-0] Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:686)
[2022-08-02 13:09:49,596] INFO [document_db_connectorII|task-0] App info kafka.producer for connector-producer-document_db_connectorII-0 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2022-08-02 13:10:19,018] INFO [document_db_connectorII|worker] Checking current members of replica set at xxxxxxx.docdb.amazonaws.com:27017 (io.debezium.connector.mongodb.ReplicaSetDiscovery:91)
[2022-08-02 13:10:49,018] INFO [document_db_connectorII|worker] Checking current members of replica set at xxxxxxx.docdb.amazonaws.com:27017 (io.debezium.connector.mongodb.ReplicaSetDiscovery:91)
[2022-08-02 13:11:18,206] INFO [Consumer clientId=consumer-connect-cluster-3, groupId=connect-cluster] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:18,288] INFO [Worker clientId=connect-1, groupId=connect-cluster] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:18,292] INFO [Consumer clientId=consumer-connect-cluster-2, groupId=connect-cluster] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:18,405] INFO [Consumer clientId=consumer-connect-cluster-1, groupId=connect-cluster] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:19,021] INFO [document_db_connectorII|worker] Checking current members of replica set at xxxxxxx.docdb.amazonaws.com:27017 (io.debezium.connector.mongodb.ReplicaSetDiscovery:91)
[2022-08-02 13:11:19,153] INFO [Producer clientId=producer-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:19,155] INFO [Producer clientId=producer-3] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:19,719] INFO [Producer clientId=producer-2] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-08-02 13:11:49,018] INFO [document_db_connectorII|worker] Checking current members of replica set at xxxxxxx.docdb.amazonaws.com:27017 (io.debezium.connector.mongodb.ReplicaSetDiscovery:91)
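To summarize what the trace shows: the connector finds no stored resume token, opens a change stream starting at Timestamp{value=0, seconds=0, inc=0}, and the server rejects the read with error 136 (CappedPositionLost), i.e. the requested position has already been trimmed from the capped change stream storage, so the task is killed. The following is a minimal standalone sketch (not the connector's own code) that mirrors the failing call with the same mongo-java-driver 4.6.1 API seen in the stack trace; the connection string, credentials, and the explicit start-at-zero timestamp are placeholders/assumptions based on the redacted log.

// ChangeStreamRepro.java - hedged sketch for reproducing the failing change stream read
import com.mongodb.client.MongoChangeStreamCursor;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import org.bson.BsonTimestamp;
import org.bson.Document;

public class ChangeStreamRepro {
    public static void main(String[] args) {
        // Placeholder URI: the real host and password are redacted in the log above.
        String uri = "mongodb://nishant:PASSWORD@xxxxxxx.docdb.amazonaws.com:27017/?replicaSet=rs0";
        try (MongoClient client = MongoClients.create(uri)) {
            // Open a deployment-wide change stream from the beginning of time, roughly what
            // "Resume token not available, starting streaming from time 'Timestamp{value=0, ...}'"
            // amounts to. On a cluster whose retention window no longer covers that position,
            // the initial aggregate fails with error 136, as in the stack trace.
            try (MongoChangeStreamCursor<ChangeStreamDocument<Document>> cursor =
                         client.watch()
                               .startAtOperationTime(new BsonTimestamp(0, 0))
                               .cursor()) {
                while (cursor.hasNext()) {
                    System.out.println(cursor.next());
                }
            }
        }
    }
}

If this standalone read reproduces error 136, the position loss is on the DocumentDB side (the change stream retention window, controlled by the cluster's change_stream_log_retention_duration parameter, no longer contains the requested point) rather than in the connector configuration itself.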