Debezium / DBZ-2392

Failing to read schema (step 5 Reading structure of captured tables) - Connection closed


Details

    • Type: Bug
    • Resolution: Done
    • Priority: Critical
    • Fix Version/s: None
    • Affects Version/s: 1.2.1.Final
    • Component/s: sqlserver-connector
    • Labels: None

    Description

      I'm having trouble using the latest version of the Debezium connector to ingest a SQL Server database into Kafka. 

      The connector config looks like this, heavily scrambled; the only things not scrambled are the connector name and a few minor properties (if I have forgotten to scramble something, please tell me or remove the sensitive parts):

      {
          "name": "AX-ConnectorV8",
          "config": {
              "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
              "tasks.max": "1",
              "database.server.name": "ax",
              "database.hostname": "[omitted]",
              "database.port": "1433",
              "database.user": "[omitted]",
              "database.password": "[omitted]",
              "database.dbname": "[omitted]",
              "table.whitelist": "[omitted]",
              "database.history.kafka.bootstrap.servers": "[omitted]",
              "database.history.kafka.topic": "dbhistory.ax",
              "database.history.producer.ssl.endpoint.identification.algorithm": "https",
              "database.history.producer.sasl.mechanism": "PLAIN",
              "database.history.producer.request.timeout.ms": "60000",
              "database.history.producer.retry.backoff.ms": "1000",
              "database.history.producer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"[omitted]\" password=\"[omitted]\";",
              "database.history.producer.security.protocol": "SASL_SSL",
              "database.history.producer.compression.type": "snappy",
              "database.history.consumer.ssl.endpoint.identification.algorithm": "https",
              "database.history.consumer.sasl.mechanism": "PLAIN",
              "database.history.consumer.request.timeout.ms": "60000",
              "database.history.consumer.retry.backoff.ms": "1000",
              "database.history.consumer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"[omitted]\" password=\"[omitted]\";",
              "database.history.consumer.security.protocol": "SASL_SSL",
              "transforms": "unwrap",
              "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
              "transforms.unwrap.drop.tombstones": "false",
              "key.converter": "io.confluent.connect.avro.AvroConverter",
              "key.converter.schema.registry.url": "[omitted]",
              "key.converter.basic.auth.credentials.source": "USER_INFO",
              "key.converter.schema.registry.basic.auth.user.info": "[omitted]",
              "value.converter": "io.confluent.connect.avro.AvroConverter",
              "value.converter.schema.registry.url": "[omitted]",
              "value.converter.basic.auth.credentials.source": "USER_INFO",
              "value.converter.schema.registry.basic.auth.user.info": "[omitted]"
          }
      }
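As an aside on hand-edited configs like the one above: running the JSON through a parser before POSTing it to the Kafka Connect REST endpoint catches dropped commas and missing top-level keys early, before Connect rejects the request. This is a generic sketch, not part of Debezium or Connect; `validate_connector_json` is a hypothetical helper:

```python
import json

def validate_connector_json(text: str) -> dict:
    """Hypothetical pre-flight check: parse connector JSON and verify the
    two top-level keys Kafka Connect expects ("name" and "config")."""
    try:
        body = json.loads(text)
    except json.JSONDecodeError as e:
        # JSONDecodeError carries the exact position of the syntax slip.
        raise ValueError(
            f"invalid connector JSON at line {e.lineno}, column {e.colno}: {e.msg}"
        ) from e
    for key in ("name", "config"):
        if key not in body:
            raise ValueError(f'connector JSON is missing top-level "{key}"')
    return body

good = '{"name": "AX-ConnectorV8", "config": {"tasks.max": "1"}}'
bad = '{"name": "AX-ConnectorV8" "config": {}}'  # dropped comma

print(validate_connector_json(good)["name"])  # AX-ConnectorV8
```

If the check passes, the same payload can be submitted unchanged to Connect's `POST /connectors` endpoint.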
      

      The error message I get when deploying this connector on 1.2.1 is:

      [2020-07-30 10:05:45,755] INFO Kafka startTimeMs: 1596103545755 (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:45,774] INFO Starting SqlServerConnectorTask with configuration: (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    connector.class = io.debezium.connector.sqlserver.SqlServerConnector (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.request.timeout.ms = 60000 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    tasks.max = 1 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.consumer.sasl.jaas.config = ******** (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.kafka.topic = dbhistory.ax (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    transforms = unwrap (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    value.converter.schema.registry.basic.auth.user.info = [omitted] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.consumer.retry.backoff.ms = 1000 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.compression.type = snappy (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.consumer.security.protocol = SASL_SSL (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    table.whitelist = [lots of tables here... omitted for security reasons] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    value.converter.basic.auth.credentials.source = USER_INFO (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.consumer.ssl.endpoint.identification.algorithm = https (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.retry.backoff.ms = 1000 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    transforms.unwrap.drop.tombstones = false (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    transforms.unwrap.type = io.debezium.transforms.ExtractNewRecordState (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    value.converter = io.confluent.connect.avro.AvroConverter (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    key.converter = io.confluent.connect.avro.AvroConverter (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.sasl.mechanism = PLAIN (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.sasl.jaas.config = ******** (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.user = STU-AG01-read (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.dbname = axProd (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.ssl.endpoint.identification.algorithm = https (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.consumer.request.timeout.ms = 60000 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.producer.security.protocol = SASL_SSL (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.kafka.bootstrap.servers = [omitted] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.server.name = ax (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.port = 1433 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    value.converter.schema.registry.url = [omitted] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    task.class = io.debezium.connector.sqlserver.SqlServerConnectorTask (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.hostname = [omitted] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.password = ******** (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    name = AX-ConnectorV8 (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    key.converter.schema.registry.basic.auth.user.info = [omitted] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    database.history.consumer.sasl.mechanism = PLAIN (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    key.converter.schema.registry.url = [omitted] (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,781] INFO    key.converter.basic.auth.credentials.source = USER_INFO (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:05:45,812] INFO [Producer clientId=connector-producer-AX-ConnectorV8-0] Cluster ID: [omitted] (org.apache.kafka.clients.Metadata)
      [2020-07-30 10:05:46,444] INFO KafkaDatabaseHistory Consumer config: {security.protocol=SASL_SSL, enable.auto.commit=false, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, group.id=ax-dbhistory, retry.backoff.ms=1000, auto.offset.reset=earliest, request.timeout.ms=60000, session.timeout.ms=10000, bootstrap.servers=[omitted], ssl.endpoint.identification.algorithm=https, sasl.jaas.config=********, client.id=ax-dbhistory, key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, sasl.mechanism=PLAIN, fetch.min.bytes=1} (io.debezium.relational.history.KafkaDatabaseHistory)
      [2020-07-30 10:05:46,452] INFO KafkaDatabaseHistory Producer config: {security.protocol=SASL_SSL, retry.backoff.ms=1000, request.timeout.ms=60000, bootstrap.servers=pkc-lz6r3.northeurope.azure.confluent.cloud:9092, ssl.endpoint.identification.algorithm=https, value.serializer=org.apache.kafka.common.serialization.StringSerializer, sasl.jaas.config=********, buffer.memory=1048576, retries=1, key.serializer=org.apache.kafka.common.serialization.StringSerializer, linger.ms=0, client.id=ax-dbhistory, sasl.mechanism=PLAIN, batch.size=32768, max.block.ms=10000, acks=1, compression.type=snappy} (io.debezium.relational.history.KafkaDatabaseHistory)
      [2020-07-30 10:05:46,463] INFO Requested thread factory for connector SqlServerConnector, id = ax named = db-history-config-check (io.debezium.util.Threads)
      [2020-07-30 10:05:46,474] INFO ProducerConfig values:
      	acks = 1
      	batch.size = 32768
      	bootstrap.servers = [omitted]
      	buffer.memory = 1048576
      	client.dns.lookup = default
      	client.id = ax-dbhistory
      	compression.type = snappy
      	connections.max.idle.ms = 540000
      	delivery.timeout.ms = 120000
      	enable.idempotence = false
      	interceptor.classes = []
      	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
      	linger.ms = 0
      	max.block.ms = 10000
      	max.in.flight.requests.per.connection = 5
      	max.request.size = 1048576
      	metadata.max.age.ms = 300000
      	metadata.max.idle.ms = 300000
      	metric.reporters = []
      	metrics.num.samples = 2
      	metrics.recording.level = INFO
      	metrics.sample.window.ms = 30000
      	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
      	receive.buffer.bytes = 32768
      	reconnect.backoff.max.ms = 1000
      	reconnect.backoff.ms = 50
      	request.timeout.ms = 60000
      	retries = 1
      	retry.backoff.ms = 1000
      	sasl.client.callback.handler.class = null
      	sasl.jaas.config = [hidden]
      	sasl.kerberos.kinit.cmd = /usr/bin/kinit
      	sasl.kerberos.min.time.before.relogin = 60000
      	sasl.kerberos.service.name = null
      	sasl.kerberos.ticket.renew.jitter = 0.05
      	sasl.kerberos.ticket.renew.window.factor = 0.8
      	sasl.login.callback.handler.class = null
      	sasl.login.class = null
      	sasl.login.refresh.buffer.seconds = 300
      	sasl.login.refresh.min.period.seconds = 60
      	sasl.login.refresh.window.factor = 0.8
      	sasl.login.refresh.window.jitter = 0.05
      	sasl.mechanism = PLAIN
      	security.protocol = SASL_SSL
      	security.providers = null
      	send.buffer.bytes = 131072
      	ssl.cipher.suites = null
      	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
      	ssl.endpoint.identification.algorithm = https
      	ssl.key.password = null
      	ssl.keymanager.algorithm = SunX509
      	ssl.keystore.location = null
      	ssl.keystore.password = null
      	ssl.keystore.type = JKS
      	ssl.protocol = TLS
      	ssl.provider = null
      	ssl.secure.random.implementation = null
      	ssl.trustmanager.algorithm = PKIX
      	ssl.truststore.location = null
      	ssl.truststore.password = null
      	ssl.truststore.type = JKS
      	transaction.timeout.ms = 60000
      	transactional.id = null
      	value.serializer = class org.apache.kafka.common.serialization.StringSerializer
       (org.apache.kafka.clients.producer.ProducerConfig)
      [2020-07-30 10:05:46,488] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:46,489] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:46,489] INFO Kafka startTimeMs: 1596103546488 (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:46,490] INFO ConsumerConfig values:
      	allow.auto.create.topics = true
      	auto.commit.interval.ms = 5000
      	auto.offset.reset = earliest
      	bootstrap.servers = [omitted]
      	check.crcs = true
      	client.dns.lookup = default
      	client.id = ax-dbhistory
      	client.rack =
      	connections.max.idle.ms = 540000
      	default.api.timeout.ms = 60000
      	enable.auto.commit = false
      	exclude.internal.topics = true
      	fetch.max.bytes = 52428800
      	fetch.max.wait.ms = 500
      	fetch.min.bytes = 1
      	group.id = ax-dbhistory
      	group.instance.id = null
      	heartbeat.interval.ms = 3000
      	interceptor.classes = []
      	internal.leave.group.on.close = true
      	isolation.level = read_uncommitted
      	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
      	max.partition.fetch.bytes = 1048576
      	max.poll.interval.ms = 300000
      	max.poll.records = 500
      	metadata.max.age.ms = 300000
      	metric.reporters = []
      	metrics.num.samples = 2
      	metrics.recording.level = INFO
      	metrics.sample.window.ms = 30000
      	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
      	receive.buffer.bytes = 65536
      	reconnect.backoff.max.ms = 1000
      	reconnect.backoff.ms = 50
      	request.timeout.ms = 60000
      	retry.backoff.ms = 1000
      	sasl.client.callback.handler.class = null
      	sasl.jaas.config = [hidden]
      	sasl.kerberos.kinit.cmd = /usr/bin/kinit
      	sasl.kerberos.min.time.before.relogin = 60000
      	sasl.kerberos.service.name = null
      	sasl.kerberos.ticket.renew.jitter = 0.05
      	sasl.kerberos.ticket.renew.window.factor = 0.8
      	sasl.login.callback.handler.class = null
      	sasl.login.class = null
      	sasl.login.refresh.buffer.seconds = 300
      	sasl.login.refresh.min.period.seconds = 60
      	sasl.login.refresh.window.factor = 0.8
      	sasl.login.refresh.window.jitter = 0.05
      	sasl.mechanism = PLAIN
      	security.protocol = SASL_SSL
      	security.providers = null
      	send.buffer.bytes = 131072
      	session.timeout.ms = 10000
      	ssl.cipher.suites = null
      	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
      	ssl.endpoint.identification.algorithm = https
      	ssl.key.password = null
      	ssl.keymanager.algorithm = SunX509
      	ssl.keystore.location = null
      	ssl.keystore.password = null
      	ssl.keystore.type = JKS
      	ssl.protocol = TLS
      	ssl.provider = null
      	ssl.secure.random.implementation = null
      	ssl.trustmanager.algorithm = PKIX
      	ssl.truststore.location = null
      	ssl.truststore.password = null
      	ssl.truststore.type = JKS
      	value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
       (org.apache.kafka.clients.consumer.ConsumerConfig)
      [2020-07-30 10:05:46,530] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:46,537] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:46,537] INFO Kafka startTimeMs: 1596103546530 (org.apache.kafka.common.utils.AppInfoParser)
      [2020-07-30 10:05:46,615] INFO [Consumer clientId=ax-dbhistory, groupId=ax-dbhistory] Cluster ID: [omitted] (org.apache.kafka.clients.Metadata)
      [2020-07-30 10:05:46,644] INFO [Producer clientId=ax-dbhistory] Cluster ID: [omitted] (org.apache.kafka.clients.Metadata)
      [2020-07-30 10:05:46,988] INFO Requested thread factory for connector SqlServerConnector, id = ax named = change-event-source-coordinator (io.debezium.util.Threads)
      [2020-07-30 10:05:46,988] INFO Creating thread debezium-sqlserverconnector-ax-change-event-source-coordinator (io.debezium.util.Threads)
      [2020-07-30 10:05:46,988] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:05:46,991] INFO Metrics registered (io.debezium.pipeline.ChangeEventSourceCoordinator)
      [2020-07-30 10:05:46,991] INFO Context created (io.debezium.pipeline.ChangeEventSourceCoordinator)
      [2020-07-30 10:05:46,995] INFO No previous offset has been found (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      [2020-07-30 10:05:46,995] INFO According to the connector configuration both schema and data will be snapshotted (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      [2020-07-30 10:05:46,995] INFO Snapshot step 1 - Preparing (io.debezium.relational.RelationalSnapshotChangeEventSource)
      [2020-07-30 10:05:46,997] INFO Snapshot step 2 - Determining captured tables (io.debezium.relational.RelationalSnapshotChangeEventSource)
      [2020-07-30 10:05:47,588] INFO Snapshot step 3 - Locking captured tables (io.debezium.relational.RelationalSnapshotChangeEventSource)
      [2020-07-30 10:05:47,589] INFO Setting locking timeout to 10 s (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      [2020-07-30 10:05:47,750] INFO Executing schema locking (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      [2020-07-30 10:05:47,751] INFO Locking table [omitted] (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      .... Omitted locking tables lines
      [2020-07-30 10:05:49,397] INFO Snapshot step 4 - Determining snapshot offset (io.debezium.relational.RelationalSnapshotChangeEventSource)
      [2020-07-30 10:05:49,438] INFO Snapshot step 5 - Reading structure of captured tables (io.debezium.relational.RelationalSnapshotChangeEventSource)
      [2020-07-30 10:05:49,438] INFO Reading structure of schema '[omitted]' (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      [2020-07-30 10:05:55,773] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:05:55,774] INFO WorkerSourceTask{id=AX-ConnectorV8-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:05,774] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:05,774] INFO WorkerSourceTask{id=AX-ConnectorV8-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:15,775] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:15,775] INFO WorkerSourceTask{id=AX-ConnectorV8-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:22,772] INFO Snapshot - Final stage (io.debezium.pipeline.source.AbstractSnapshotChangeEventSource)
      [2020-07-30 10:06:23,016] INFO Removing locking timeout (io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource)
      [2020-07-30 10:06:23,055] ERROR Producer failure (io.debezium.pipeline.ErrorHandler)
      java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed.
      	at io.debezium.relational.RelationalSnapshotChangeEventSource.rollbackTransaction(RelationalSnapshotChangeEventSource.java:435)
      	at io.debezium.relational.RelationalSnapshotChangeEventSource.doExecute(RelationalSnapshotChangeEventSource.java:152)
      	at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:63)
      	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:96)
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed.
      	at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:234)
      	at com.microsoft.sqlserver.jdbc.SQLServerConnection.checkClosed(SQLServerConnection.java:1088)
      	at com.microsoft.sqlserver.jdbc.SQLServerConnection.rollback(SQLServerConnection.java:3153)
      	at io.debezium.relational.RelationalSnapshotChangeEventSource.rollbackTransaction(RelationalSnapshotChangeEventSource.java:432)
      	... 8 more
      [2020-07-30 10:06:23,055] INFO Connected metrics set to 'false' (io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics)
      [2020-07-30 10:06:23,488] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:23,491] INFO WorkerSourceTask{id=AX-ConnectorV8-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:23,491] ERROR WorkerSourceTask{id=AX-ConnectorV8-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
      org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
      	at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
      	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:117)
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed.
      	at io.debezium.relational.RelationalSnapshotChangeEventSource.rollbackTransaction(RelationalSnapshotChangeEventSource.java:435)
      	at io.debezium.relational.RelationalSnapshotChangeEventSource.doExecute(RelationalSnapshotChangeEventSource.java:152)
      	at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:63)
      	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:96)
      	... 5 more
      Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed.
      	at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:234)
      	at com.microsoft.sqlserver.jdbc.SQLServerConnection.checkClosed(SQLServerConnection.java:1088)
      	at com.microsoft.sqlserver.jdbc.SQLServerConnection.rollback(SQLServerConnection.java:3153)
      	at io.debezium.relational.RelationalSnapshotChangeEventSource.rollbackTransaction(RelationalSnapshotChangeEventSource.java:432)
      	... 8 more
      [2020-07-30 10:06:23,491] ERROR WorkerSourceTask{id=AX-ConnectorV8-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask)
      [2020-07-30 10:06:23,491] INFO Stopping down connector (io.debezium.connector.common.BaseSourceTask)
      [2020-07-30 10:06:23,492] INFO [Producer clientId=ax-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer)
      [2020-07-30 10:06:23,496] INFO [Producer clientId=connector-producer-AX-ConnectorV8-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer)
      [2020-07-30 10:06:25,775] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:25,775] INFO WorkerSourceTask{id=AX-ConnectorV8-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:35,775] INFO WorkerSourceTask{id=AX-ConnectorV8-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
      [2020-07-30 10:06:35,775] INFO WorkerSourceTask{id=AX-ConnectorV8-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
      

      Can anyone help me figure out what the flip is going on? :/ Any help is appreciated! 
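One note on reading the trace above: it ends in `RelationalSnapshotChangeEventSource.rollbackTransaction`, which runs in the error path, so "The connection is closed." may be masking whatever first killed the snapshot (e.g. the server dropping the session during the long step-5 schema read). A minimal sketch of that masking pattern, assuming a hypothetical timeout as the root cause (this is illustrative Python, not Debezium's actual code):

```python
class ConnectionClosedError(RuntimeError):
    """Stand-in for SQLServerException: The connection is closed."""

class FakeConnection:
    def __init__(self):
        self.closed = False

    def rollback(self):
        if self.closed:
            raise ConnectionClosedError("The connection is closed.")

def snapshot(conn):
    try:
        # Hypothetical root cause: the server drops the session mid-read,
        # closing the connection before the error path runs.
        conn.closed = True
        raise TimeoutError("timed out reading structure of captured tables")
    except Exception:
        conn.rollback()  # raises too, and *this* is what the caller sees
        raise

try:
    snapshot(FakeConnection())
except Exception as e:
    print(type(e).__name__)              # ConnectionClosedError
    print(type(e.__context__).__name__)  # TimeoutError (Python chains it)
```

In the Java trace the rollback's exception is all that surfaces, which is why the log gives no hint of the underlying cause; checking the SQL Server error log for the session around 10:06 might reveal it.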

          People

            Assignee: Unassigned
            Reporter: jarlandreh Jarl André Hübenthal (Inactive)
            Votes: 0
            Watchers: 2