Plugins are loaded from /kafka/connect
Using the following environment variables:
  GROUP_ID=sysint-kafka-connect
  CONFIG_STORAGE_TOPIC=_sysint_connect_configs
  OFFSET_STORAGE_TOPIC=_sysint_connect_offsets
  STATUS_STORAGE_TOPIC=_sysint_connect_status
  BOOTSTRAP_SERVERS=kafka:29093
  REST_HOST_NAME=172.18.0.6
  REST_PORT=8083
  ADVERTISED_HOST_NAME=172.18.0.6
  ADVERTISED_PORT=8083
  KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter
  VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter
  INTERNAL_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter
  INTERNAL_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter
  OFFSET_FLUSH_INTERVAL_MS=5000
  OFFSET_FLUSH_TIMEOUT_MS=5000
  SHUTDOWN_TIMEOUT=10000
--- Setting property from CONNECT_INTERNAL_VALUE_CONVERTER: internal.value.converter=org.apache.kafka.connect.json.JsonConverter
--- Setting property from CONNECT_VALUE_CONVERTER: value.converter=org.apache.kafka.connect.json.JsonConverter
--- Setting property from CONNECT_REST_ADVERTISED_HOST_NAME: rest.advertised.host.name=172.18.0.6
--- Setting property from CONNECT_OFFSET_FLUSH_INTERVAL_MS: offset.flush.interval.ms=5000
--- Setting property from CONNECT_GROUP_ID: group.id=sysint-kafka-connect
--- Setting property from CONNECT_BOOTSTRAP_SERVERS: bootstrap.servers=kafka:29093
--- Setting property from CONNECT_KEY_CONVERTER: key.converter=org.apache.kafka.connect.json.JsonConverter
--- Setting property from CONNECT_TASK_SHUTDOWN_GRACEFUL_TIMEOUT_MS: task.shutdown.graceful.timeout.ms=10000
--- Setting property from CONNECT_REST_HOST_NAME: rest.host.name=172.18.0.6
--- Setting property from CONNECT_PLUGIN_PATH: plugin.path=/kafka/connect
--- Setting property from CONNECT_REST_PORT: rest.port=8083
--- Setting property from CONNECT_OFFSET_FLUSH_TIMEOUT_MS: offset.flush.timeout.ms=5000
--- Setting property from CONNECT_STATUS_STORAGE_TOPIC: status.storage.topic=_sysint_connect_status
--- Setting property from CONNECT_INTERNAL_KEY_CONVERTER: internal.key.converter=org.apache.kafka.connect.json.JsonConverter
--- Setting property from CONNECT_CONFIG_STORAGE_TOPIC: config.storage.topic=_sysint_connect_configs
--- Setting property from CONNECT_REST_ADVERTISED_PORT: rest.advertised.port=8083
--- Setting property from CONNECT_OFFSET_STORAGE_TOPIC: offset.storage.topic=_sysint_connect_offsets
2020-10-16 09:08:45,730 INFO || WorkerInfo values:
    jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dcom.sun.management.jmxremote.port=1976, -Dkafka.logs.dir=/kafka/bin/../logs, -Dlog4j.configuration=file:/kafka/config/log4j.properties, -javaagent:/kafka/jmx_prometheus_javaagent.jar=8080:/kafka/config.yml
    jvm.spec = Oracle Corporation, OpenJDK 64-Bit Server VM, 11.0.8, 11.0.8+10-LTS
    jvm.classpath =
/kafka/bin/../libs/activation-1.1.1.jar:/kafka/bin/../libs/aopalliance-repackaged-2.5.0.jar:/kafka/bin/../libs/argparse4j-0.7.0.jar:/kafka/bin/../libs/audience-annotations-0.5.0.jar:/kafka/bin/../libs/avro-1.9.2.jar:/kafka/bin/../libs/common-config-5.5.1.jar:/kafka/bin/../libs/common-utils-5.5.1.jar:/kafka/bin/../libs/commons-cli-1.4.jar:/kafka/bin/../libs/commons-lang3-3.8.1.jar:/kafka/bin/../libs/connect-api-2.6.0.jar:/kafka/bin/../libs/connect-basic-auth-extension-2.6.0.jar:/kafka/bin/../libs/connect-file-2.6.0.jar:/kafka/bin/../libs/connect-json-2.6.0.jar:/kafka/bin/../libs/connect-mirror-2.6.0.jar:/kafka/bin/../libs/connect-mirror-client-2.6.0.jar:/kafka/bin/../libs/connect-runtime-2.6.0.jar:/kafka/bin/../libs/connect-transforms-2.6.0.jar:/kafka/bin/../libs/hk2-api-2.5.0.jar:/kafka/bin/../libs/hk2-locator-2.5.0.jar:/kafka/bin/../libs/hk2-utils-2.5.0.jar:/kafka/bin/../libs/jackson-annotations-2.10.2.jar:/kafka/bin/../libs/jackson-core-2.10.2.jar:/kafka/bin/../libs/jackson-databind-2.10.2.jar:/kafka/bin/../libs/jackson-dataformat-csv-2.10.2.jar:/kafka/bin/../libs/jackson-datatype-jdk8-2.10.2.jar:/kafka/bin/../libs/jackson-jaxrs-base-2.10.2.jar:/kafka/bin/../libs/jackson-jaxrs-json-provider-2.10.2.jar:/kafka/bin/../libs/jackson-module-jaxb-annotations-2.10.2.jar:/kafka/bin/../libs/jackson-module-paranamer-2.10.2.jar:/kafka/bin/../libs/jackson-module-scala_2.12-2.10.2.jar:/kafka/bin/../libs/jakarta.activation-api-1.2.1.jar:/kafka/bin/../libs/jakarta.annotation-api-1.3.4.jar:/kafka/bin/../libs/jakarta.inject-2.5.0.jar:/kafka/bin/../libs/jakarta.ws.rs-api-2.1.5.jar:/kafka/bin/../libs/jakarta.xml.bind-api-2.3.2.jar:/kafka/bin/../libs/javassist-3.22.0-CR2.jar:/kafka/bin/../libs/javassist-3.26.0-GA.jar:/kafka/bin/../libs/javax.servlet-api-3.1.0.jar:/kafka/bin/../libs/javax.ws.rs-api-2.1.1.jar:/kafka/bin/../libs/jaxb-api-2.3.0.jar:/kafka/bin/../libs/jersey-client-2.28.jar:/kafka/bin/../libs/jersey-common-2.28.jar:/kafka/bin/../libs/jersey-container-servlet-2.28.jar:/kafka/bin/../libs/jersey-container-servlet-core-2.28.jar:/kafka/bin/../libs/jersey-hk2-2.28.jar:/kafka/bin/../libs/jersey-media-jaxb-2.28.jar:/kafka/bin/../libs/jersey-server-2.28.jar:/kafka/bin/../libs/jetty-client-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-continuation-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-http-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-io-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-security-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-server-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-servlet-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-servlets-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-util-9.4.24.v20191120.jar:/kafka/bin/../libs/jopt-simple-5.0.4.jar:/kafka/bin/../libs/kafka-avro-serializer-5.5.1.jar:/kafka/bin/../libs/kafka-clients-2.6.0.jar:/kafka/bin/../libs/kafka-connect-avro-converter-5.5.1.jar:/kafka/bin/../libs/kafka-connect-avro-data-5.5.1.jar:/kafka/bin/../libs/kafka-log4j-appender-2.6.0.jar:/kafka/bin/../libs/kafka-schema-registry-client-5.5.1.jar:/kafka/bin/../libs/kafka-schema-serializer-5.5.1.jar:/kafka/bin/../libs/kafka-streams-2.6.0.jar:/kafka/bin/../libs/kafka-streams-examples-2.6.0.jar:/kafka/bin/../libs/kafka-streams-scala_2.12-2.6.0.jar:/kafka/bin/../libs/kafka-streams-test-utils-2.6.0.jar:/kafka/bin/../libs/kafka-tools-2.6.0.jar:/kafka/bin/../libs/kafka_2.12-2.6.0.jar:/kafka/bin/../libs/log4j-1.2.17.jar:/kafka/bin/../libs/lz4-java-1.7.1.jar:/kafka/bin/../libs/maven-artifact-3.6.3.jar:/kafka/bin/../libs/metrics-core-2.2.0.jar:/kafka/bin/../libs/netty-buffer-4.1.50.Final
.jar:/kafka/bin/../libs/netty-codec-4.1.50.Final.jar:/kafka/bin/../libs/netty-common-4.1.50.Final.jar:/kafka/bin/../libs/netty-handler-4.1.50.Final.jar:/kafka/bin/../libs/netty-resolver-4.1.50.Final.jar:/kafka/bin/../libs/netty-transport-4.1.50.Final.jar:/kafka/bin/../libs/netty-transport-native-epoll-4.1.50.Final.jar:/kafka/bin/../libs/netty-transport-native-unix-common-4.1.50.Final.jar:/kafka/bin/../libs/osgi-resource-locator-1.0.1.jar:/kafka/bin/../libs/paranamer-2.8.jar:/kafka/bin/../libs/plexus-utils-3.2.1.jar:/kafka/bin/../libs/reflections-0.9.12.jar:/kafka/bin/../libs/rocksdbjni-5.18.4.jar:/kafka/bin/../libs/scala-collection-compat_2.12-2.1.6.jar:/kafka/bin/../libs/scala-java8-compat_2.12-0.9.1.jar:/kafka/bin/../libs/scala-library-2.12.11.jar:/kafka/bin/../libs/scala-logging_2.12-3.9.2.jar:/kafka/bin/../libs/scala-reflect-2.12.11.jar:/kafka/bin/../libs/slf4j-api-1.7.30.jar:/kafka/bin/../libs/slf4j-log4j12-1.7.30.jar:/kafka/bin/../libs/snappy-java-1.1.7.3.jar:/kafka/bin/../libs/validation-api-2.0.1.Final.jar:/kafka/bin/../libs/zookeeper-3.5.8.jar:/kafka/bin/../libs/zookeeper-jute-3.5.8.jar:/kafka/bin/../libs/zstd-jni-1.4.4-7.jar os.spec = Linux, amd64, 4.19.76-linuxkit os.vcpus = 4 [org.apache.kafka.connect.runtime.WorkerInfo] 2020-10-16 09:08:45,740 INFO || Scanning for plugin classes. This might take a moment ... [org.apache.kafka.connect.cli.ConnectDistributed] 2020-10-16 09:08:45,768 INFO || Loading plugin from: /kafka/connect/kafka-connect-insert-uuid [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,884 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/kafka-connect-insert-uuid/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,884 INFO || Added plugin 'com.github.cjmatta.kafka.connect.smt.InsertUuid$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,886 INFO || Added plugin 'com.github.cjmatta.kafka.connect.smt.InsertUuid$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,887 INFO || Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,887 INFO || Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,887 INFO || Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:45,890 INFO || Loading plugin from: /kafka/connect/debezium-connector-mongodb [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,553 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-mongodb/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,553 INFO || Added plugin 'io.debezium.connector.mongodb.MongoDbConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Added plugin 'io.debezium.converters.ByteBufferConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Added plugin 'io.debezium.converters.CloudEventsConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Added 
plugin 'io.debezium.transforms.ExtractNewRecordState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Added plugin 'io.debezium.transforms.outbox.EventRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Added plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Added plugin 'io.debezium.transforms.ByLogicalTableRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:46,554 INFO || Loading plugin from: /kafka/connect/debezium-connector-mysql [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:47,341 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-mysql/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:47,341 INFO || Added plugin 'io.debezium.connector.mysql.MySqlConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:47,357 INFO || Loading plugin from: /kafka/connect/debezium-connector-sqlserver [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:47,706 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-sqlserver/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:47,706 INFO || Added plugin 'io.debezium.connector.sqlserver.SqlServerConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:47,770 INFO || Loading plugin from: /kafka/connect/debezium-connector-postgres [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:48,075 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-postgres/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:48,079 INFO || Added plugin 'io.debezium.connector.postgresql.PostgresConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,536 INFO || Registered loader: jdk.internal.loader.ClassLoaders$AppClassLoader@3d4eac69 [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,536 INFO || Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,536 INFO || Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,536 INFO || Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,536 INFO || Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,536 INFO || Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 
09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.tools.MockConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.converters.FloatConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,537 INFO || Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,538 INFO || Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,538 INFO || Added plugin 'io.confluent.connect.avro.AvroConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,538 INFO || Added plugin 'org.apache.kafka.connect.converters.LongConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,538 INFO || Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,538 INFO || Added plugin 'org.apache.kafka.connect.json.JsonConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,544 INFO || Added plugin 'org.apache.kafka.connect.storage.StringConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.converters.ShortConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.transforms.Filter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,545 INFO || Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 
'org.apache.kafka.connect.transforms.TimestampRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,546 INFO || Added plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,547 INFO || Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,548 INFO || Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,553 INFO || Added plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,553 INFO || Added plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,553 INFO || Added plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,553 INFO || Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,553 INFO || Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,554 INFO || Added aliases 
'MongoDbConnector' and 'MongoDb' to plugin 'io.debezium.connector.mongodb.MongoDbConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,554 INFO || Added aliases 'MySqlConnector' and 'MySql' to plugin 'io.debezium.connector.mysql.MySqlConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,554 INFO || Added aliases 'PostgresConnector' and 'Postgres' to plugin 'io.debezium.connector.postgresql.PostgresConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,554 INFO || Added aliases 'SqlServerConnector' and 'SqlServer' to plugin 'io.debezium.connector.sqlserver.SqlServerConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,555 INFO || Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,555 INFO || Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,555 INFO || Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,555 INFO || Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,555 INFO || Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,555 INFO || Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,556 INFO || Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,556 INFO || Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,556 INFO || Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,560 INFO || Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,560 INFO || Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,560 INFO || Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,560 INFO || Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 
'io.debezium.converters.ByteBufferConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,561 INFO || Added aliases 'CloudEventsConverter' and 'CloudEvents' to plugin 'io.debezium.converters.CloudEventsConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,561 INFO || Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,561 INFO || Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,561 INFO || Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,561 INFO || Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,562 INFO || Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,562 INFO || Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,562 INFO || Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,563 INFO || Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,563 INFO || Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,563 INFO || Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,563 INFO || Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,563 INFO || Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,563 INFO || Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,564 INFO || Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,564 INFO || Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,564 INFO || Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' 
[org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,564 INFO || Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,565 INFO || Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,565 INFO || Added alias 'ExtractNewDocumentState' to plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,566 INFO || Added alias 'ByLogicalTableRouter' to plugin 'io.debezium.transforms.ByLogicalTableRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,566 INFO || Added alias 'ExtractNewRecordState' to plugin 'io.debezium.transforms.ExtractNewRecordState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,567 INFO || Added alias 'EventRouter' to plugin 'io.debezium.transforms.outbox.EventRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,567 INFO || Added aliases 'PredicatedTransformation' and 'Predicated' to plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,567 INFO || Added alias 'Filter' to plugin 'org.apache.kafka.connect.transforms.Filter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,567 INFO || Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,568 INFO || Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,568 INFO || Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,568 INFO || Added alias 'HasHeaderKey' to plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,568 INFO || Added alias 'RecordIsTombstone' to plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,568 INFO || Added alias 'TopicNameMatches' to plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,569 INFO || Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,570 INFO || Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:08:50,570 INFO || Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' 
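The "--- Setting property from CONNECT_…" lines near the top of this log show the naming convention the container applies when it turns CONNECT_* environment variables into worker properties: drop the prefix, lowercase, and replace underscores with dots. Below is a minimal Python sketch of that convention, for illustration only; it is not the image's actual startup script, and the sample values are simply copied from the log above.

```python
def connect_env_to_properties(environ):
    """Derive Kafka Connect worker properties from CONNECT_* environment
    variables: strip the prefix, lowercase, and turn underscores into dots.
    Pass os.environ to apply it to the real environment."""
    props = {}
    for name, value in environ.items():
        if name.startswith("CONNECT_"):
            key = name[len("CONNECT_"):].lower().replace("_", ".")
            props[key] = value
    return props

# Sample values taken from the log above.
sample = {
    "CONNECT_GROUP_ID": "sysint-kafka-connect",
    "CONNECT_BOOTSTRAP_SERVERS": "kafka:29093",
    "CONNECT_CONFIG_STORAGE_TOPIC": "_sysint_connect_configs",
    "CONNECT_KEY_CONVERTER": "org.apache.kafka.connect.json.JsonConverter",
}

for key, value in sorted(connect_env_to_properties(sample).items()):
    print(f"{key}={value}")
# bootstrap.servers=kafka:29093
# config.storage.topic=_sysint_connect_configs
# group.id=sysint-kafka-connect
# key.converter=org.apache.kafka.connect.json.JsonConverter
```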
[org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader]
2020-10-16 09:08:50,570 INFO || Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader]
2020-10-16 09:08:50,659 INFO || DistributedConfig values:
    access.control.allow.methods =
    access.control.allow.origin =
    admin.listeners = null
    bootstrap.servers = [kafka:29093]
    client.dns.lookup = use_all_dns_ips
    client.id =
    config.providers = []
    config.storage.replication.factor = 1
    config.storage.topic = _sysint_connect_configs
    connect.protocol = sessioned
    connections.max.idle.ms = 540000
    connector.client.config.override.policy = None
    group.id = sysint-kafka-connect
    header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
    heartbeat.interval.ms = 3000
    inter.worker.key.generation.algorithm = HmacSHA256
    inter.worker.key.size = null
    inter.worker.key.ttl.ms = 3600000
    inter.worker.signature.algorithm = HmacSHA256
    inter.worker.verification.algorithms = [HmacSHA256]
    internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
    internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
    key.converter = class org.apache.kafka.connect.json.JsonConverter
    listeners = null
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 5000
    offset.flush.timeout.ms = 5000
    offset.storage.partitions = 25
    offset.storage.replication.factor = 1
    offset.storage.topic = _sysint_connect_offsets
    plugin.path = [/kafka/connect]
    rebalance.timeout.ms = 60000
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 40000
    response.http.headers.config =
    rest.advertised.host.name = 172.18.0.6
    rest.advertised.listener = null
    rest.advertised.port = 8083
    rest.extension.classes = []
    rest.host.name = 172.18.0.6
    rest.port = 8083
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    scheduled.rebalance.max.delay.ms = 300000
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.client.auth = none
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    status.storage.partitions = 5
    status.storage.replication.factor = 1
    status.storage.topic = _sysint_connect_status
    task.shutdown.graceful.timeout.ms = 10000
    topic.creation.enable = true
    topic.tracking.allow.reset = true
    topic.tracking.enable = true
    value.converter = class
org.apache.kafka.connect.json.JsonConverter worker.sync.timeout.ms = 3000 worker.unsync.backoff.ms = 300000 [org.apache.kafka.connect.runtime.distributed.DistributedConfig] 2020-10-16 09:08:50,659 INFO || Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. [org.apache.kafka.connect.runtime.WorkerConfig] 2020-10-16 09:08:50,660 INFO || Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. [org.apache.kafka.connect.runtime.WorkerConfig] 2020-10-16 09:08:50,662 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:50,668 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,826 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,826 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,835 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
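The jvm.args in the WorkerInfo entry above include -javaagent:/kafka/jmx_prometheus_javaagent.jar=8080:/kafka/config.yml, so the worker also exports its JMX metrics over HTTP on port 8080. A minimal sketch for pulling a sample of those metrics follows; it assumes the container address reported in this log (172.18.0.6) is reachable from wherever the script runs, and the metric names themselves depend on the rules defined in /kafka/config.yml.

```python
from urllib.request import urlopen

# Port 8080 comes from the -javaagent argument in the WorkerInfo jvm.args above;
# the host is the container address reported by the worker in this log.
METRICS_URL = "http://172.18.0.6:8080/metrics"

with urlopen(METRICS_URL, timeout=10) as resp:
    text = resp.read().decode("utf-8")

# Metric names depend on the rules in /kafka/config.yml, so just show a sample
# of the non-comment lines returned by the exporter.
samples = [line for line in text.splitlines() if line and not line.startswith("#")]
print("\n".join(samples[:20]))
```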
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:50,836 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:50,836 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:50,836 INFO || Kafka startTimeMs: 1602839330835 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,458 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:51,504 INFO || Logging initialized @6738ms to org.eclipse.jetty.util.log.Slf4jLog [org.eclipse.jetty.util.log] 2020-10-16 09:08:51,601 INFO || Added connector for http://172.18.0.6:8083 [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,601 INFO || Initializing REST server [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,611 INFO || jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 11.0.8+10-LTS [org.eclipse.jetty.server.Server] 2020-10-16 09:08:51,661 INFO || Started http_172.18.0.68083@6f89292e{HTTP/1.1,[http/1.1]}{172.18.0.6:8083} [org.eclipse.jetty.server.AbstractConnector] 2020-10-16 09:08:51,661 INFO || Started @6895ms [org.eclipse.jetty.server.Server] 2020-10-16 09:08:51,699 INFO || Advertised URI: http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,700 INFO || REST server listening at http://172.18.0.6:8083/, advertising URL http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,700 INFO || Advertised URI: http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,700 INFO || REST admin endpoints at http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,700 INFO || Advertised URI: http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:51,701 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:51,702 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null 
ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,705 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,705 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,705 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,705 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,705 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,705 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,706 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,707 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,707 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,707 INFO || Kafka startTimeMs: 1602839331706 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,756 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:51,791 INFO || Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden [org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy] 2020-10-16 09:08:51,809 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:51,813 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,828 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,829 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
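The REST server reported earlier is listening on and advertising http://172.18.0.6:8083/. A quick way to confirm the worker is up and to see which connector classes the plugin scan registered is the Connect REST API; here is a minimal sketch using only the standard library, assuming that address is reachable from where the script runs.

```python
import json
from urllib.request import urlopen

BASE = "http://172.18.0.6:8083"  # advertised URL from the log above

# Root resource: worker version, commit, and Kafka cluster id
# (should match 2.6.0 and lAkC5TqFRP2c-SJAbs5uNA from this log).
with urlopen(f"{BASE}/", timeout=10) as resp:
    print(json.load(resp))

# Installed connector plugins. In Kafka 2.6 this endpoint lists connector
# classes only; SMTs and converters do not appear here.
with urlopen(f"{BASE}/connector-plugins", timeout=10) as resp:
    for plugin in json.load(resp):
        print(plugin["class"], plugin.get("version"))
```

GET /connectors on the same base URL would similarly list any connectors already restored from the _sysint_connect_configs topic.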
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,830 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,830 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,830 INFO || Kafka startTimeMs: 1602839331829 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,845 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:51,849 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,850 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,850 INFO || Kafka startTimeMs: 1602839331849 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,982 INFO || JsonConverterConfig values: converter.type = key decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:08:51,983 INFO || JsonConverterConfig values: converter.type = value decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:08:51,984 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:51,985 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,988 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,989 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,989 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,989 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:51,989 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,989 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:51,989 INFO || Kafka startTimeMs: 1602839331989 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,028 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:52,050 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:52,051 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,053 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,053 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,053 INFO || Kafka startTimeMs: 1602839332053 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,094 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:52,097 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:52,097 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,099 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,099 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,099 INFO || Kafka startTimeMs: 1602839332099 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:52,137 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:52,154 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:08:52,155 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,159 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,160 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,161 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,169 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,169 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,169 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:52,169 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,169 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,170 INFO || Kafka startTimeMs: 1602839332169 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,202 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils]
2020-10-16 09:08:52,256 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,256 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,257 INFO || Kafka startTimeMs: 1602839332250 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,260 INFO || Kafka Connect distributed worker initialization took 6521ms [org.apache.kafka.connect.cli.ConnectDistributed]
2020-10-16 09:08:52,261 INFO || Kafka Connect starting [org.apache.kafka.connect.runtime.Connect]
2020-10-16 09:08:52,262 INFO || Initializing REST resources [org.apache.kafka.connect.runtime.rest.RestServer]
2020-10-16 09:08:52,263 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder starting [org.apache.kafka.connect.runtime.distributed.DistributedHerder]
2020-10-16 09:08:52,303 INFO || Worker starting [org.apache.kafka.connect.runtime.Worker]
2020-10-16 09:08:52,303 INFO || Starting KafkaOffsetBackingStore [org.apache.kafka.connect.storage.KafkaOffsetBackingStore]
2020-10-16 09:08:52,304 INFO || Starting KafkaBasedLog with topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog]
2020-10-16 09:08:52,304 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:52,305 WARN || The configuration 'group.id' was supplied but isn't a known config.
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,306 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:52,307 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:52,307 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:52,307 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,312 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,312 INFO || Kafka startTimeMs: 1602839332307 [org.apache.kafka.common.utils.AppInfoParser]
2020-10-16 09:08:52,399 INFO || Adding admin resources to main listener [org.apache.kafka.connect.runtime.rest.RestServer]
2020-10-16 09:08:52,556 INFO || DefaultSessionIdManager workerName=node0 [org.eclipse.jetty.server.session]
2020-10-16 09:08:52,556 INFO || No SessionScavenger set, using defaults [org.eclipse.jetty.server.session]
2020-10-16 09:08:52,557 INFO || node0 Scavenging every 600000ms [org.eclipse.jetty.server.session]
Oct 16, 2020 9:08:53 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource will be ignored.
Oct 16, 2020 9:08:53 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Oct 16, 2020 9:08:53 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Oct 16, 2020 9:08:53 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
2020-10-16 09:08:53,771 INFO || Created topic (name=_sysint_connect_offsets, numPartitions=25, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at kafka:29093 [org.apache.kafka.connect.util.TopicAdmin] 2020-10-16 09:08:53,788 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-1 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,838 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,839 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,840 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:53,841 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:53,841 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:53,841 INFO || Kafka startTimeMs: 1602839333840 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:53,885 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-sysint-kafka-connect-1 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = sysint-kafka-connect group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,903 INFO || [Producer clientId=producer-1] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:53,971 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,979 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:53,980 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:53,980 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:53,980 INFO || Kafka startTimeMs: 1602839333980 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:53,989 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:54,079 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Subscribed to partition(s): _sysint_connect_offsets-0, _sysint_connect_offsets-5, _sysint_connect_offsets-10, _sysint_connect_offsets-20, _sysint_connect_offsets-15, _sysint_connect_offsets-9, _sysint_connect_offsets-11, _sysint_connect_offsets-4, _sysint_connect_offsets-16, _sysint_connect_offsets-17, _sysint_connect_offsets-3, _sysint_connect_offsets-24, _sysint_connect_offsets-23, _sysint_connect_offsets-13, _sysint_connect_offsets-18, _sysint_connect_offsets-22, _sysint_connect_offsets-2, _sysint_connect_offsets-8, _sysint_connect_offsets-12, _sysint_connect_offsets-19, _sysint_connect_offsets-14, _sysint_connect_offsets-1, _sysint_connect_offsets-6, _sysint_connect_offsets-7, _sysint_connect_offsets-21 [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:08:54,081 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-0 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,082 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-5 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-10 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-20 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-15 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-9 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-11 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-4 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition 
_sysint_connect_offsets-16 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-17 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-3 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,083 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-24 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-23 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-13 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-18 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-22 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-2 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-8 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-12 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-19 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-14 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-1 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,084 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-6 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,085 INFO || 
[Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-7 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,085 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-21 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,163 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-24 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,164 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-18 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,164 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-16 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,164 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-22 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-20 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-9 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-7 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-13 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-11 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-1 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-5 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-3 to offset 0. 
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-23 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-17 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,170 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-15 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-21 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-19 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-10 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-8 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-14 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-12 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-2 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-0 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-6 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,171 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-4 to offset 0. 
[org.apache.kafka.clients.consumer.internals.SubscriptionState]
2020-10-16 09:08:54,176 INFO || Finished reading KafkaBasedLog for topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog]
2020-10-16 09:08:54,176 INFO || Started KafkaBasedLog for topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog]
2020-10-16 09:08:54,176 INFO || Finished reading offsets topic and starting KafkaOffsetBackingStore [org.apache.kafka.connect.storage.KafkaOffsetBackingStore]
2020-10-16 09:08:54,198 INFO || Worker started [org.apache.kafka.connect.runtime.Worker]
2020-10-16 09:08:54,198 INFO || Starting KafkaBasedLog with topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog]
2020-10-16 09:08:54,205 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig]
2020-10-16 09:08:54,228 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config.
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,228 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,228 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,228 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,228 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,228 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,228 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,229 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,232 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,232 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,232 INFO || Kafka startTimeMs: 1602839334229 [org.apache.kafka.common.utils.AppInfoParser] Oct 16, 2020 9:08:54 AM org.glassfish.jersey.internal.Errors logErrors WARNING: The following warnings have been detected: WARNING: The (sub)resource method listLoggers in org.apache.kafka.connect.runtime.rest.resources.LoggingResource contains empty path annotation. WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation. 
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation. WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation. WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation. 2020-10-16 09:08:54,374 INFO || Started o.e.j.s.ServletContextHandler@64d4f7c7{/,null,AVAILABLE} [org.eclipse.jetty.server.handler.ContextHandler] 2020-10-16 09:08:54,382 INFO || REST resources initialized; server is started and ready to handle requests [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:08:54,382 INFO || Kafka Connect started [org.apache.kafka.connect.runtime.Connect] 2020-10-16 09:08:54,427 INFO || Created topic (name=_sysint_connect_status, numPartitions=5, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at kafka:29093 [org.apache.kafka.connect.util.TopicAdmin] 2020-10-16 09:08:54,429 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-2 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000  retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'group.id' was supplied but isn't a known config. 
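The records above show the REST server initialized ("server is started and ready to handle requests") and "Kafka Connect started" on the advertised address 172.18.0.6:8083. A minimal sketch for confirming the worker answers on its standard REST endpoints before any connector is registered, assuming that address is reachable from the machine running the check:

import json
import urllib.request

# Address taken from the rest.advertised.host.name / rest.advertised.port values in the log;
# adjust if the worker is reached through a different hostname or port mapping.
CONNECT_URL = "http://172.18.0.6:8083"

# GET / returns version info, /connectors lists registered connectors,
# /connector-plugins lists the plugins found on plugin.path.
for path in ("/", "/connectors", "/connector-plugins"):
    with urllib.request.urlopen(CONNECT_URL + path) as resp:
        print(path, "->", json.loads(resp.read()))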
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,433 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,434 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,435 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,436 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,438 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,438 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,438 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,438 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,438 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,438 INFO || Kafka startTimeMs: 1602839334438 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,439 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-sysint-kafka-connect-2 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = sysint-kafka-connect group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,445 INFO || [Producer clientId=producer-2] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:54,450 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,452 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,453 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,453 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,453 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,453 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,454 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,455 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,455 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,455 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,455 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,455 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,455 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,455 INFO || Kafka startTimeMs: 1602839334455 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,466 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:54,487 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Subscribed to partition(s): _sysint_connect_status-0, _sysint_connect_status-4, _sysint_connect_status-1, _sysint_connect_status-2, _sysint_connect_status-3 [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:08:54,488 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-0 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,491 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-4 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,491 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-1 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,493 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-2 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,493 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-3 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,514 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-2 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,514 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-1 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,516 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-0 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,516 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-4 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,516 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-3 to offset 0. 
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,523 INFO || Finished reading KafkaBasedLog for topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:08:54,523 INFO || Started KafkaBasedLog for topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:08:54,523 INFO || Starting KafkaConfigBackingStore [org.apache.kafka.connect.storage.KafkaConfigBackingStore] 2020-10-16 09:08:54,523 INFO || Starting KafkaBasedLog with topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:08:54,524 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:08:54,525 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,525 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,525 INFO || Kafka startTimeMs: 1602839334525 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,658 INFO || Created topic (name=_sysint_connect_configs, numPartitions=1, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at kafka:29093 [org.apache.kafka.connect.util.TopicAdmin] 2020-10-16 09:08:54,660 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-3 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,666 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
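As with _sysint_connect_status earlier, the worker auto-creates _sysint_connect_configs here as a compacted topic with replicationFactor=1 via TopicAdmin. The following is a sketch of creating the same two internal topics up front with exactly the settings logged in the "Created topic (...)" records, assuming the third-party kafka-python package; adjust the replication factor to your own cluster's requirements:

from kafka.admin import KafkaAdminClient, NewTopic

# Broker address, topic names, partition counts and cleanup.policy copied from the log.
admin = KafkaAdminClient(bootstrap_servers="kafka:29093", client_id="sysint-topic-setup")

topics = [
    NewTopic(name="_sysint_connect_configs", num_partitions=1, replication_factor=1,
             topic_configs={"cleanup.policy": "compact"}),
    NewTopic(name="_sysint_connect_status", num_partitions=5, replication_factor=1,
             topic_configs={"cleanup.policy": "compact"}),
]
# Raises kafka.errors.TopicAlreadyExistsError if the topics are already present.
admin.create_topics(new_topics=topics)
admin.close()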
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,667 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,673 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:08:54,673 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,673 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,673 INFO || Kafka startTimeMs: 1602839334673 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,674 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-sysint-kafka-connect-3 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = sysint-kafka-connect group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:08:54,677 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,677 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,677 INFO || Kafka startTimeMs: 1602839334677 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:08:54,681 INFO || [Producer clientId=producer-3] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:54,692 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:54,709 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Subscribed to partition(s): _sysint_connect_configs-0 [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:08:54,709 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_configs-0 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,718 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_configs-0 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:08:54,719 INFO || Finished reading KafkaBasedLog for topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:08:54,721 INFO || Started KafkaBasedLog for topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:08:54,721 INFO || Started KafkaConfigBackingStore [org.apache.kafka.connect.storage.KafkaConfigBackingStore] 2020-10-16 09:08:54,721 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder started [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:08:54,754 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:08:56,440 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Discovered group coordinator kafka:29093 (id: 2147483646 rack: null) [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:08:56,441 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Rebalance started [org.apache.kafka.connect.runtime.distributed.WorkerCoordinator] 2020-10-16 09:08:56,442 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:08:56,489 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group. 
[org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:08:56,489 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:08:56,716 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Successfully joined group with generation 1 [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:08:56,717 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Joined group at generation 1 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-d2e0a5b7-877e-469e-b9fe-d73d9d8549d0', leaderUrl='http://172.18.0.6:8083/', offset=-1, connectorIds=[], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:08:56,718 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting connectors and tasks using config offset -1 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:08:56,718 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Finished starting connectors and tasks [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:08:56,813 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Session key updated [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:47,468 INFO || AbstractConfig values: [org.apache.kafka.common.config.AbstractConfig] 2020-10-16 09:09:47,475 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Connector sysint-sqlserver-tec-runonly-connector config updated [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:47,977 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Rebalance started [org.apache.kafka.connect.runtime.distributed.WorkerCoordinator] 2020-10-16 09:09:47,977 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:09:47,994 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Successfully joined group with generation 2 [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:09:47,995 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Joined group at generation 2 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-d2e0a5b7-877e-469e-b9fe-d73d9d8549d0', leaderUrl='http://172.18.0.6:8083/', offset=2, connectorIds=[sysint-sqlserver-tec-runonly-connector], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:47,995 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting connectors and tasks using config offset 2 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:47,997 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting connector sysint-sqlserver-tec-runonly-connector [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:48,000 INFO || Creating connector sysint-sqlserver-tec-runonly-connector of type io.debezium.connector.sqlserver.SqlServerConnector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:48,001 INFO || SourceConnectorConfig values: config.action.reload = restart 
connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.SourceConnectorConfig] 2020-10-16 09:09:48,003 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:09:48,006 INFO || Instantiated connector sysint-sqlserver-tec-runonly-connector with version 1.3.0.Final of type class io.debezium.connector.sqlserver.SqlServerConnector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:48,007 INFO || Finished creating connector sysint-sqlserver-tec-runonly-connector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:48,008 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Finished starting connectors and tasks [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:48,017 INFO || SourceConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.SourceConnectorConfig] 2020-10-16 09:09:48,027 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector 
errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:09:48,987 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Tasks [sysint-sqlserver-tec-runonly-connector-0] configs updated [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:49,490 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Handling task config update by restarting tasks [] [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:49,490 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Rebalance started [org.apache.kafka.connect.runtime.distributed.WorkerCoordinator] 2020-10-16 09:09:49,490 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:09:49,496 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Successfully joined group with generation 3 [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:09:49,496 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Joined group at generation 3 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-d2e0a5b7-877e-469e-b9fe-d73d9d8549d0', leaderUrl='http://172.18.0.6:8083/', offset=4, connectorIds=[sysint-sqlserver-tec-runonly-connector], taskIds=[sysint-sqlserver-tec-runonly-connector-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:49,497 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting connectors and tasks using config offset 4 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:49,497 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting task sysint-sqlserver-tec-runonly-connector-0 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:49,498 INFO || Creating task sysint-sqlserver-tec-runonly-connector-0 [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:49,499 INFO || ConnectorConfig values: config.action.reload = 
restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig] 2020-10-16 09:09:49,500 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:09:49,501 INFO || TaskConfig values: task.class = class io.debezium.connector.sqlserver.SqlServerConnectorTask [org.apache.kafka.connect.runtime.TaskConfig] 2020-10-16 09:09:49,501 INFO || Instantiated task sysint-sqlserver-tec-runonly-connector-0 with version 1.3.0.Final of type io.debezium.connector.sqlserver.SqlServerConnectorTask [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:49,501 INFO || JsonConverterConfig values: converter.type = key decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:09:49,502 INFO || JsonConverterConfig values: converter.type = value decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:09:49,502 INFO || Set up the key converter class org.apache.kafka.connect.json.JsonConverter for task sysint-sqlserver-tec-runonly-connector-0 using the connector config [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:49,502 INFO || Set up the value converter class org.apache.kafka.connect.json.JsonConverter for task sysint-sqlserver-tec-runonly-connector-0 using the connector config [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:49,502 INFO || Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task sysint-sqlserver-tec-runonly-connector-0 using the worker config 
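The SourceConnectorConfig / EnrichedConnectorConfig dumps above spell out the connector's effective configuration: the Debezium SqlServerConnector class, a single task, JSON key and value converters, and the transformation chain unwrap, route, insertuuid with their parameters. Below is a hedged reconstruction of the registration request that would produce such a configuration, using only values visible in the log; the SQL Server connection properties (database.hostname, user, password, database.server.name and so on) do not appear in this part of the log and are deliberately left out as placeholders:

import json
import urllib.request

CONNECT_URL = "http://172.18.0.6:8083"   # advertised REST address from the log
NAME = "sysint-sqlserver-tec-runonly-connector"

# Only settings that appear in the logged (Enriched)ConnectorConfig are reproduced here.
config = {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "transforms": "unwrap,route,insertuuid",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false",
    "transforms.unwrap.delete.handling.mode": "rewrite",
    "transforms.unwrap.add.fields": "schema,db,table,op,ts_ms,change_lsn,commit_lsn,event_serial_no,data_collection_order",
    "transforms.unwrap.add.headers": "version,connector,name",
    "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.route.regex": "(.*)",
    "transforms.route.replacement": "it.company.sysint.data.cdc.tables.$1",
    "transforms.insertuuid.type": "com.github.cjmatta.kafka.connect.smt.InsertUuid$Value",
    "transforms.insertuuid.uuid.field.name": "__uuid",
    # "database.hostname": "...",   # connection settings not shown in this log section
}

req = urllib.request.Request(
    f"{CONNECT_URL}/connectors/{NAME}/config",
    data=json.dumps(config).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",                    # PUT creates the connector or updates its config
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, json.loads(resp.read()))

After a successful PUT, the connector's state can be polled with GET /connectors/sysint-sqlserver-tec-runonly-connector/status on the same REST address.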
[org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:49,504 INFO || SourceConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.SourceConnectorConfig] 2020-10-16 09:09:49,504 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:09:49,511 INFO || Initializing: org.apache.kafka.connect.runtime.TransformationChain{io.debezium.transforms.ExtractNewRecordState, org.apache.kafka.connect.transforms.RegexRouter, com.github.cjmatta.kafka.connect.smt.InsertUuid$Value} [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:09:49,512 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = connector-producer-sysint-sqlserver-tec-runonly-connector-0 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 9223372036854775807 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 
request.timeout.ms = 2147483647 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:09:49,514 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:09:49,514 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:09:49,514 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,515 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,515 INFO || Kafka startTimeMs: 1602839389514 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,518 INFO || [Producer clientId=connector-producer-sysint-sqlserver-tec-runonly-connector-0] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:09:49,522 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Finished starting connectors and tasks [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. 
[io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,526 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,529 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,529 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,529 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,529 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,529 INFO || Starting SqlServerConnectorTask with configuration: [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || connector.class = io.debezium.connector.sqlserver.SqlServerConnector [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || errors.log.include.messages = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.unwrap.delete.handling.mode = rewrite [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || tasks.max = 1 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.history.kafka.topic = it.company.sysint.data.cdc.db.history.tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms = unwrap,route,insertuuid [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || provide.transaction.metadata = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || table.whitelist = dbo.tab1,dbo.tab2 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || tombstones.on.delete = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.route.type = org.apache.kafka.connect.transforms.RegexRouter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.route.regex = (.*) [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || decimal.handling.mode = string [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.unwrap.drop.tombstones = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.unwrap.type = io.debezium.transforms.ExtractNewRecordState [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || value.converter = org.apache.kafka.connect.json.JsonConverter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.route.replacement = 
it.company.sysint.data.cdc.tables.$1 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || errors.log.enable = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || key.converter = org.apache.kafka.connect.json.JsonConverter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.user = sa [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.dbname = tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.history.kafka.bootstrap.servers = kafka:29093 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.insertuuid.uuid.field.name = __uuid [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.server.name = tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || snapshot.isolation.mode = read_committed [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || event.processing.failure.handling.mode = warn [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.unwrap.add.headers = version,connector,name [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.port = 1433 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || key.converter.schemas.enable = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || task.class = io.debezium.connector.sqlserver.SqlServerConnectorTask [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.hostname = sqlserver [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || database.password = ******** [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || value.converter.schemas.enable = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || name = sysint-sqlserver-tec-runonly-connector [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || transforms.unwrap.add.fields = schema,db,table,op,ts_ms,change_lsn,commit_lsn,event_serial_no,data_collection_order [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,531 INFO || retriable.restart.connector.wait.ms = 10000 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,532 INFO || snapshot.mode = schema_only [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,532 INFO || transforms.insertuuid.type = com.github.cjmatta.kafka.connect.smt.InsertUuid$Value [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:09:49,536 WARN || Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,536 WARN || Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,536 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,537 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. 
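For reference, the task configuration dumped above corresponds to a connector registration along the following lines. This is a sketch reconstructed only from the logged properties: the worker address (localhost:8083) is an assumption, and the password is a placeholder because the log masks it.

# Hypothetical re-creation of the connector from the properties logged above.
curl -s -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "sysint-sqlserver-tec-runonly-connector",
    "config": {
      "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
      "tasks.max": "1",
      "database.hostname": "sqlserver",
      "database.port": "1433",
      "database.user": "sa",
      "database.password": "********",
      "database.dbname": "tec",
      "database.server.name": "tec",
      "table.whitelist": "dbo.tab1,dbo.tab2",
      "database.history.kafka.bootstrap.servers": "kafka:29093",
      "database.history.kafka.topic": "it.company.sysint.data.cdc.db.history.tec",
      "snapshot.mode": "schema_only",
      "snapshot.isolation.mode": "read_committed",
      "decimal.handling.mode": "string",
      "provide.transaction.metadata": "true",
      "tombstones.on.delete": "false",
      "event.processing.failure.handling.mode": "warn",
      "retriable.restart.connector.wait.ms": "10000",
      "errors.log.enable": "true",
      "errors.log.include.messages": "true",
      "key.converter": "org.apache.kafka.connect.json.JsonConverter",
      "key.converter.schemas.enable": "false",
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable": "false",
      "transforms": "unwrap,route,insertuuid",
      "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
      "transforms.unwrap.drop.tombstones": "false",
      "transforms.unwrap.delete.handling.mode": "rewrite",
      "transforms.unwrap.add.fields": "schema,db,table,op,ts_ms,change_lsn,commit_lsn,event_serial_no,data_collection_order",
      "transforms.unwrap.add.headers": "version,connector,name",
      "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
      "transforms.route.regex": "(.*)",
      "transforms.route.replacement": "it.company.sysint.data.cdc.tables.$1",
      "transforms.insertuuid.type": "com.github.cjmatta.kafka.connect.smt.InsertUuid$Value",
      "transforms.insertuuid.uuid.field.name": "__uuid"
    }
  }'

The RegexRouter replacement accounts for the it.company.sysint.data.cdc.tables.* topic names that appear later in the log, and snapshot.mode = schema_only is why only the schema is snapshotted below.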
[io.debezium.config.Configuration] 2020-10-16 09:09:49,538 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,538 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,538 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:49,877 INFO || KafkaDatabaseHistory Consumer config: {key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, enable.auto.commit=false, group.id=tec-dbhistory, bootstrap.servers=kafka:29093, fetch.min.bytes=1, session.timeout.ms=10000, auto.offset.reset=earliest, client.id=tec-dbhistory} [io.debezium.relational.history.KafkaDatabaseHistory] 2020-10-16 09:09:49,877 INFO || KafkaDatabaseHistory Producer config: {retries=1, value.serializer=org.apache.kafka.common.serialization.StringSerializer, acks=1, batch.size=32768, max.block.ms=10000, bootstrap.servers=kafka:29093, buffer.memory=1048576, key.serializer=org.apache.kafka.common.serialization.StringSerializer, client.id=tec-dbhistory, linger.ms=0} [io.debezium.relational.history.KafkaDatabaseHistory] 2020-10-16 09:09:49,878 INFO || Requested thread factory for connector SqlServerConnector, id = tec named = db-history-config-check [io.debezium.util.Threads] 2020-10-16 09:09:49,879 INFO || ProducerConfig values: acks = 1 batch.size = 32768 bootstrap.servers = [kafka:29093] buffer.memory = 1048576 client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 10000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 1 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null 
ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:09:49,882 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,882 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,882 INFO || Kafka startTimeMs: 1602839389882 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,884 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tec-dbhistory group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:09:49,891 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,891 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,891 INFO || Kafka startTimeMs: 1602839389891 
[org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,892 INFO || [Producer clientId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:09:49,895 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:09:49,898 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000  retries = 1 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'value.serializer' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'acks' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'batch.size' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'max.block.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'buffer.memory' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'key.serializer' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,899 WARN || The configuration 'linger.ms' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:09:49,900 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,900 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,900 INFO || Kafka startTimeMs: 1602839389900 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:09:49,920 INFO || Database history topic '(name=it.company.sysint.data.cdc.db.history.tec, numPartitions=1, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=delete, retention.ms=9223372036854775807, retention.bytes=-1})' created [io.debezium.relational.history.KafkaDatabaseHistory] 2020-10-16 09:09:50,243 INFO || Requested thread factory for connector SqlServerConnector, id = tec named = change-event-source-coordinator [io.debezium.util.Threads] 2020-10-16 09:09:50,244 INFO || Creating thread debezium-sqlserverconnector-tec-change-event-source-coordinator [io.debezium.util.Threads] 2020-10-16 09:09:50,245 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Source task finished initialization and start [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:09:50,246 INFO || Metrics registered [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:09:50,247 INFO || Context created [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:09:50,250 INFO || No previous offset has been found [io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource] 2020-10-16 09:09:50,250 INFO || According to the connector configuration only schema will be snapshotted [io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource] 2020-10-16 09:09:50,250 INFO || Snapshot step 1 - Preparing [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:50,251 INFO || Snapshot step 2 - Determining captured tables [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:50,398 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:09:50,400 INFO || Snapshot step 3 - Locking captured tables [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:50,401 INFO || Schema locking was disabled in connector configuration [io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource] 2020-10-16 09:09:50,401 INFO || Snapshot step 4 - Determining snapshot offset [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:50,405 INFO || Snapshot step 5 - Reading structure of captured tables [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:50,405 INFO || Reading structure of schema 'tec' [io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource] 2020-10-16 09:09:50,905 WARN || Cannot parse column default value '(NULL)' to type 'int'. 
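This warning (one per affected column, each followed by a NumberFormatException stack trace) is raised while the snapshot reads column metadata: SQL Server reports the default-constraint text '(NULL)' for an int column, and this Debezium version attempts to parse that text as an integer. It is non-fatal; the snapshot continues with steps 6 and 7 below. A hypothetical column definition that yields this kind of '(NULL)' default (not taken from the log; connection parameters are assumed) would be:

# Hypothetical example: an int column whose default constraint is literally NULL,
# which SQL Server stores and reports back as "(NULL)".
sqlcmd -S sqlserver,1433 -U sa -P "$SA_PASSWORD" -d tec -Q "
  CREATE TABLE dbo.Example (
    Id       int IDENTITY(1,1) PRIMARY KEY,
    SomeFlag int NULL CONSTRAINT DF_Example_SomeFlag DEFAULT (NULL)
  );"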
[io.debezium.connector.sqlserver.SqlServerDefaultValueConverter] java.lang.NumberFormatException: For input string: "UL" at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) at java.base/java.lang.Integer.parseInt(Integer.java:652) at java.base/java.lang.Integer.parseInt(Integer.java:770) at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$1(SqlServerDefaultValueConverter.java:115) at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:82) at io.debezium.connector.sqlserver.SqlServerConnection.getDefaultValue(SqlServerConnection.java:512) at io.debezium.jdbc.JdbcConnection.readTableColumn(JdbcConnection.java:1181) at io.debezium.jdbc.JdbcConnection.readSchema(JdbcConnection.java:1126) at io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource.readTableStructure(SqlServerSnapshotChangeEventSource.java:183) at io.debezium.relational.RelationalSnapshotChangeEventSource.doExecute(RelationalSnapshotChangeEventSource.java:122) at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:63) at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:105) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) 2020-10-16 09:09:50,921 WARN || Cannot parse column default value '(NULL)' to type 'int'. 
[io.debezium.connector.sqlserver.SqlServerDefaultValueConverter] java.lang.NumberFormatException: For input string: "UL" at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) at java.base/java.lang.Integer.parseInt(Integer.java:652) at java.base/java.lang.Integer.parseInt(Integer.java:770) at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$1(SqlServerDefaultValueConverter.java:115) at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:82) at io.debezium.connector.sqlserver.SqlServerConnection.getDefaultValue(SqlServerConnection.java:512) at io.debezium.jdbc.JdbcConnection.readTableColumn(JdbcConnection.java:1181) at io.debezium.jdbc.JdbcConnection.readSchema(JdbcConnection.java:1126) at io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource.readTableStructure(SqlServerSnapshotChangeEventSource.java:183) at io.debezium.relational.RelationalSnapshotChangeEventSource.doExecute(RelationalSnapshotChangeEventSource.java:122) at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:63) at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:105) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) 2020-10-16 09:09:51,113 INFO || Snapshot step 6 - Persisting schema history [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:51,190 INFO || Snapshot step 7 - Skipping snapshotting of data [io.debezium.relational.RelationalSnapshotChangeEventSource] 2020-10-16 09:09:51,193 INFO || Snapshot - Final stage [io.debezium.pipeline.source.AbstractSnapshotChangeEventSource] 2020-10-16 09:09:51,194 INFO || Removing locking timeout [io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource] 2020-10-16 09:09:51,195 INFO || Snapshot ended with SnapshotResult [status=COMPLETED, offset=SqlServerOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT}, sourceInfo=SourceInfo [serverName=tec, changeLsn=NULL, commitLsn=0000003f:00001040:0010, eventSerialNo=null, snapshot=FALSE, sourceTime=2020-10-16T09:09:51.188Z], partition={server=tec}, snapshotCompleted=true, eventSerialNo=1]] [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:09:51,197 INFO || Connected metrics set to 'true' [io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics] 2020-10-16 09:09:51,197 INFO || Starting streaming [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:09:51,230 INFO || CDC is enabled for table Capture instance "dbo_VatType" [sourceTableId=tec.dbo.VatType, changeTableId=tec.cdc.dbo_VatType_CT, startLsn=0000003f:00000038:0042, changeTableObjectId=683149479, stopLsn=NULL] but the table is not whitelisted by connector [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:09:51,230 INFO || CDC is enabled for table Capture instance "dbo_Registry" [sourceTableId=tec.dbo.Registry, changeTableId=tec.cdc.dbo_Registry_CT, startLsn=0000003f:000007d8:011f, changeTableObjectId=779149821, stopLsn=NULL] but the table is not 
whitelisted by connector [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:09:51,231 INFO || Last position recorded in offsets is 0000003f:00001040:0010(NULL)[1] [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:09:51,261 WARN || [Producer clientId=connector-producer-sysint-sqlserver-tec-runonly-connector-0] Error while fetching metadata with correlation id 3 : {it.company.sysint.data.cdc.tables.tec=LEADER_NOT_AVAILABLE} [org.apache.kafka.clients.NetworkClient] 2020-10-16 09:09:54,521 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:09:54,521 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:09:54,530 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Finished commitOffsets successfully in 9 ms [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:09:59,530 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:09:59,530 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:04,531 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:04,531 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:09,531 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:09,531 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:14,531 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:14,531 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:19,532 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:19,532 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:24,532 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:24,532 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:29,533 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:29,533 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset 
commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:34,533 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:34,533 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:39,533 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:39,534 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:44,534 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:44,534 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:49,534 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:49,534 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:54,535 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:54,535 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:59,535 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:10:59,535 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:04,536 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:04,536 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:09,536 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:09,536 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:14,537 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:14,537 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:19,537 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:19,537 INFO || 
WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:24,537 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:24,537 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:29,538 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:29,538 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:34,538 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:34,538 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:39,539 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:39,539 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:44,539 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:44,539 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:49,539 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:49,540 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:54,540 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:54,540 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:59,540 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:11:59,540 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:04,541 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:04,541 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:09,541 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets 
[org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:09,541 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:14,541 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:14,542 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:19,542 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:19,542 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:24,542 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:24,542 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:29,543 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:29,543 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:33,254 WARN || [Producer clientId=connector-producer-sysint-sqlserver-tec-runonly-connector-0] Error while fetching metadata with correlation id 10 : {it.company.sysint.data.cdc.tables.tec.transaction=LEADER_NOT_AVAILABLE} [org.apache.kafka.clients.NetworkClient] 2020-10-16 09:12:33,372 WARN || [Producer clientId=connector-producer-sysint-sqlserver-tec-runonly-connector-0] Error while fetching metadata with correlation id 13 : {it.company.sysint.data.cdc.tables.tec.dbo.Payment=LEADER_NOT_AVAILABLE} [org.apache.kafka.clients.NetworkClient] 2020-10-16 09:12:34,543 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:34,543 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:34,545 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Finished commitOffsets successfully in 2 ms [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:39,546 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:39,546 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:44,546 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:44,546 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:49,547 
INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:49,547 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:54,547 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:54,547 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:59,547 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:12:59,548 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:03,698 ERROR || Producer failure [io.debezium.pipeline.ErrorHandler] com.microsoft.sqlserver.jdbc.SQLServerException: SQL Server did not return a response. The connection has been closed. at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:2892) at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:2881) at com.microsoft.sqlserver.jdbc.TDSReader.readPacket(IOBuffer.java:6425) at com.microsoft.sqlserver.jdbc.TDSCommand.startResponse(IOBuffer.java:7579) at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:866) at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:768) at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7194) at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2935) at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:248) at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:223) at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeQuery(SQLServerStatement.java:693) at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:623) at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:492) at io.debezium.connector.sqlserver.SqlServerConnection.getMaxLsn(SqlServerConnection.java:149) at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:128) at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:140) at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:113) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) 2020-10-16 09:13:03,699 INFO || Finished streaming [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:13:03,699 INFO || Connected metrics set to 'false' [io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics] 2020-10-16 09:13:03,739 WARN || Going to restart connector after 10 sec. 
after a retriable exception [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:03,747 INFO || [Producer clientId=tec-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. [org.apache.kafka.clients.producer.KafkaProducer] 2020-10-16 09:13:03,748 WARN || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} failed to poll records from SourceTask. Will retry operation. [org.apache.kafka.connect.runtime.WorkerSourceTask] org.apache.kafka.connect.errors.RetriableException: An exception occurred in the change event producer. This connector will be restarted. at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:38) at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:283) at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:140) at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:113) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: SQL Server did not return a response. The connection has been closed. at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:2892) at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:2881) at com.microsoft.sqlserver.jdbc.TDSReader.readPacket(IOBuffer.java:6425) at com.microsoft.sqlserver.jdbc.TDSCommand.startResponse(IOBuffer.java:7579) at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:866) at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:768) at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7194) at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2935) at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:248) at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:223) at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeQuery(SQLServerStatement.java:693) at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:623) at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:492) at io.debezium.connector.sqlserver.SqlServerConnection.getMaxLsn(SqlServerConnection.java:149) at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:128) ... 
7 more 2020-10-16 09:13:03,749 INFO || Awaiting end of restart backoff period after a retriable error [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:04,548 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:04,548 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:05,750 INFO || Awaiting end of restart backoff period after a retriable error [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:07,750 INFO || Awaiting end of restart backoff period after a retriable error [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:09,548 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:09,548 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:09,751 INFO || Awaiting end of restart backoff period after a retriable error [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:11,751 INFO || Awaiting end of restart backoff period after a retriable error [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,751 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. 
[io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,752 INFO || Starting SqlServerConnectorTask with configuration: [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || connector.class = io.debezium.connector.sqlserver.SqlServerConnector [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || errors.log.include.messages = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.unwrap.delete.handling.mode = rewrite [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || tasks.max = 1 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || database.history.kafka.topic = it.company.sysint.data.cdc.db.history.tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms = unwrap,route,insertuuid [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || provide.transaction.metadata = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || table.whitelist = dbo.tab1,dbo.tab2 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || tombstones.on.delete = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.route.type = org.apache.kafka.connect.transforms.RegexRouter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.route.regex = (.*) [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || decimal.handling.mode = string [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.unwrap.drop.tombstones = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.unwrap.type = io.debezium.transforms.ExtractNewRecordState [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || value.converter = org.apache.kafka.connect.json.JsonConverter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || errors.log.enable = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || key.converter = org.apache.kafka.connect.json.JsonConverter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || database.user = sa [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || database.dbname = tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || database.history.kafka.bootstrap.servers = kafka:29093 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || transforms.insertuuid.uuid.field.name = __uuid [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || database.server.name = tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || snapshot.isolation.mode = read_committed [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,752 INFO || 
event.processing.failure.handling.mode = warn [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || transforms.unwrap.add.headers = version,connector,name [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || database.port = 1433 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || key.converter.schemas.enable = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || task.class = io.debezium.connector.sqlserver.SqlServerConnectorTask [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || database.hostname = sqlserver [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || database.password = ******** [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || value.converter.schemas.enable = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || name = sysint-sqlserver-tec-runonly-connector [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || transforms.unwrap.add.fields = schema,db,table,op,ts_ms,change_lsn,commit_lsn,event_serial_no,data_collection_order [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || retriable.restart.connector.wait.ms = 10000 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || snapshot.mode = schema_only [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 INFO || transforms.insertuuid.type = com.github.cjmatta.kafka.connect.smt.InsertUuid$Value [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:13:13,753 WARN || Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,753 WARN || Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,753 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,753 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,753 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,753 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:13:13,753 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. 
2020-10-16 09:13:14,549 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:14,549 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:14,549 WARN || Couldn't commit processed log positions with the source database due to a concurrent connector shutdown or restart [io.debezium.connector.common.BaseSourceTask]
2020-10-16 09:13:19,549 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:19,550 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:19,550 WARN || Couldn't commit processed log positions with the source database due to a concurrent connector shutdown or restart [io.debezium.connector.common.BaseSourceTask]
2020-10-16 09:13:24,550 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:24,550 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:24,550 WARN || Couldn't commit processed log positions with the source database due to a concurrent connector shutdown or restart [io.debezium.connector.common.BaseSourceTask]
2020-10-16 09:13:28,279 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:28,279 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-10-16 09:13:28,279 ERROR || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Task threw an uncaught and unrecoverable exception [org.apache.kafka.connect.runtime.WorkerTask]
java.lang.RuntimeException: Couldn't obtain database name
    at io.debezium.connector.sqlserver.SqlServerConnection.retrieveRealDatabaseName(SqlServerConnection.java:474)
    at io.debezium.connector.sqlserver.SqlServerConnection.<init>(SqlServerConnection.java:117)
    at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:75)
    at io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:106)
    at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:161)
    at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:124)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:289)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:256)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:185)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:235)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host sqlserver, port 1433 has failed. Error: "sqlserver. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:234)
    at com.microsoft.sqlserver.jdbc.SQLServerException.ConvertConnectExceptionToSQLServerException(SQLServerException.java:285)
    at com.microsoft.sqlserver.jdbc.SocketFinder.findSocket(IOBuffer.java:2431)
    at com.microsoft.sqlserver.jdbc.TDSChannel.open(IOBuffer.java:656)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2440)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:2103)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1950)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1162)
    at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:735)
    at io.debezium.jdbc.JdbcConnection.lambda$patternBasedFactory$1(JdbcConnection.java:222)
    at io.debezium.jdbc.JdbcConnection$ConnectionFactoryDecorator.connect(JdbcConnection.java:107)
    at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:852)
    at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:847)
    at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:618)
    at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:492)
    at io.debezium.connector.sqlserver.SqlServerConnection.retrieveRealDatabaseName(SqlServerConnection.java:469)
    ... 14 more
2020-10-16 09:13:28,280 ERROR || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Task is being killed and will not recover until manually restarted [org.apache.kafka.connect.runtime.WorkerTask]
2020-10-16 09:13:28,280 INFO || Stopping down connector [io.debezium.connector.common.BaseSourceTask]
2020-10-16 09:13:28,280 WARN || Unable to unregister the MBean 'debezium.sql_server:type=connector-metrics,context=snapshot,server=tec': debezium.sql_server:type=connector-metrics,context=snapshot,server=tec [io.debezium.pipeline.ChangeEventSourceCoordinator]
2020-10-16 09:13:28,280 WARN || Unable to unregister the MBean 'debezium.sql_server:type=connector-metrics,context=streaming,server=tec': debezium.sql_server:type=connector-metrics,context=streaming,server=tec [io.debezium.pipeline.ChangeEventSourceCoordinator]
2020-10-16 09:13:28,280 WARN || Unable to unregister the MBean 'debezium.sql_server:type=connector-metrics,context=schema-history,server=tec': debezium.sql_server:type=connector-metrics,context=schema-history,server=tec [io.debezium.relational.history.DatabaseHistoryMetrics]
2020-10-16 09:13:28,280 INFO || [Producer clientId=connector-producer-sysint-sqlserver-tec-runonly-connector-0] Closing the Kafka producer with timeoutMillis = 30000 ms. 
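The root cause is the nested SQLServerException: the worker could not open a plain TCP connection to sqlserver:1433 (the quoted error text starting with just "sqlserver" is consistent with the hostname not resolving from the Connect container), so SqlServerConnection.retrieveRealDatabaseName fails and Connect marks the task FAILED, "killed and will not recover until manually restarted". The sketch below is a hedged diagnostic aid, not part of the original setup: it checks whether the SQL Server host is reachable from the same network the worker uses and, once it is, restarts the failed task through the Connect REST API (GET /connectors/{name}/status and POST /connectors/{name}/tasks/{id}/restart). Host, port, worker address and connector name are copied from the log; everything else is illustrative.

# Hedged diagnostic sketch: verify that the SQL Server host is reachable from
# the network the Connect worker runs in, then restart the failed task via the
# Kafka Connect REST API. Run it from a container attached to the same Docker
# network so that the hostname "sqlserver" resolves the same way.
import json
import socket
import urllib.request

SQLSERVER_HOST, SQLSERVER_PORT = "sqlserver", 1433
CONNECT_URL = "http://172.18.0.6:8083"
NAME = "sysint-sqlserver-tec-runonly-connector"

def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Attempt the same plain TCP connection the JDBC driver failed to make."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        print(f"{host}:{port} not reachable: {exc}")
        return False

def task_status() -> dict:
    """GET /connectors/{name}/status reports connector and task state (e.g. FAILED plus the trace)."""
    with urllib.request.urlopen(f"{CONNECT_URL}/connectors/{NAME}/status") as resp:
        return json.load(resp)

def restart_task(task_id: int = 0) -> None:
    """POST /connectors/{name}/tasks/{id}/restart re-runs a task that Connect has killed."""
    req = urllib.request.Request(
        f"{CONNECT_URL}/connectors/{NAME}/tasks/{task_id}/restart", method="POST"
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    if tcp_reachable(SQLSERVER_HOST, SQLSERVER_PORT):
        print(json.dumps(task_status(), indent=2))
        restart_task(0)  # only useful once the database is actually reachable

If the hostname never resolves, the usual suspects are the SQL Server container not being on the same Docker network as Connect, or not having started yet when the task came up.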
[org.apache.kafka.clients.producer.KafkaProducer] 2020-10-16 09:13:29,550 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:29,551 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:34,551 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:34,551 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:39,551 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:39,551 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:44,552 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:44,552 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:49,552 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:49,552 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:54,553 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:54,553 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:59,553 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:13:59,553 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:04,554 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:04,554 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:09,554 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:09,554 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:14,554 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:14,555 INFO || 
WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:19,555 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:19,555 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:24,555 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:24,555 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:29,556 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:29,556 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:34,556 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:34,556 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:39,556 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:39,556 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:44,557 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:44,557 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:49,557 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:49,557 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:54,557 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:54,557 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:59,558 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:14:59,558 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:04,558 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets 
[org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:04,558 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:09,559 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:09,559 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:14,559 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:14,559 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:19,560 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:19,560 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:24,560 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:24,560 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:29,323 INFO || Kafka Connect stopping [org.apache.kafka.connect.runtime.Connect] 2020-10-16 09:15:29,324 INFO || Stopping REST server [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:29,328 INFO || Stopped http_172.18.0.68083@6f89292e{HTTP/1.1,[http/1.1]}{172.18.0.6:8083} [org.eclipse.jetty.server.AbstractConnector] 2020-10-16 09:15:29,329 INFO || node0 Stopped scavenging [org.eclipse.jetty.server.session] 2020-10-16 09:15:29,330 INFO || REST server stopped [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:29,330 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder stopping [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:29,330 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Stopping connectors and tasks that are still assigned to this worker. 
[org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:29,331 INFO || Stopping connector sysint-sqlserver-tec-runonly-connector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:29,331 INFO || Scheduled shutdown for WorkerConnector{id=sysint-sqlserver-tec-runonly-connector} [org.apache.kafka.connect.runtime.WorkerConnector] 2020-10-16 09:15:29,331 INFO || Stopping task sysint-sqlserver-tec-runonly-connector-0 [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:29,332 INFO || Completed shutdown for WorkerConnector{id=sysint-sqlserver-tec-runonly-connector} [org.apache.kafka.connect.runtime.WorkerConnector] 2020-10-16 09:15:29,336 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Member connect-1-d2e0a5b7-877e-469e-b9fe-d73d9d8549d0 sending LeaveGroup request to coordinator kafka:29093 (id: 2147483646 rack: null) due to the consumer is being closed [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:29,336 WARN || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Close timed out with 1 pending requests to coordinator, terminating client connections [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:29,337 INFO || Stopping KafkaBasedLog for topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:29,338 INFO || [Producer clientId=producer-2] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. [org.apache.kafka.clients.producer.KafkaProducer] 2020-10-16 09:15:29,346 INFO || Stopped KafkaBasedLog for topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:29,346 INFO || Closing KafkaConfigBackingStore [org.apache.kafka.connect.storage.KafkaConfigBackingStore] 2020-10-16 09:15:29,346 INFO || Stopping KafkaBasedLog for topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:29,346 INFO || [Producer clientId=producer-3] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. [org.apache.kafka.clients.producer.KafkaProducer] 2020-10-16 09:15:29,348 INFO || Stopped KafkaBasedLog for topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:29,348 INFO || Closed KafkaConfigBackingStore [org.apache.kafka.connect.storage.KafkaConfigBackingStore] 2020-10-16 09:15:29,348 INFO || Worker stopping [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:29,348 INFO || Stopping KafkaOffsetBackingStore [org.apache.kafka.connect.storage.KafkaOffsetBackingStore] 2020-10-16 09:15:29,348 INFO || Stopping KafkaBasedLog for topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:29,349 INFO || [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. 
[org.apache.kafka.clients.producer.KafkaProducer] 2020-10-16 09:15:29,351 INFO || Stopped KafkaBasedLog for topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:29,351 INFO || Stopped KafkaOffsetBackingStore [org.apache.kafka.connect.storage.KafkaOffsetBackingStore] 2020-10-16 09:15:29,351 INFO || Worker stopped [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:29,351 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder stopped [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:29,353 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder stopped [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:29,354 INFO || Kafka Connect stopped [org.apache.kafka.connect.runtime.Connect] Plugins are loaded from /kafka/connect Using the following environment variables: GROUP_ID=sysint-kafka-connect CONFIG_STORAGE_TOPIC=_sysint_connect_configs OFFSET_STORAGE_TOPIC=_sysint_connect_offsets STATUS_STORAGE_TOPIC=_sysint_connect_status BOOTSTRAP_SERVERS=kafka:29093 REST_HOST_NAME=172.18.0.6 REST_PORT=8083 ADVERTISED_HOST_NAME=172.18.0.6 ADVERTISED_PORT=8083 KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter INTERNAL_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter INTERNAL_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter OFFSET_FLUSH_INTERVAL_MS=5000 OFFSET_FLUSH_TIMEOUT_MS=5000 SHUTDOWN_TIMEOUT=10000 --- Setting property from CONNECT_INTERNAL_VALUE_CONVERTER: internal.value.converter=org.apache.kafka.connect.json.JsonConverter --- Setting property from CONNECT_VALUE_CONVERTER: value.converter=org.apache.kafka.connect.json.JsonConverter --- Setting property from CONNECT_REST_ADVERTISED_HOST_NAME: rest.advertised.host.name=172.18.0.6 --- Setting property from CONNECT_OFFSET_FLUSH_INTERVAL_MS: offset.flush.interval.ms=5000 --- Setting property from CONNECT_GROUP_ID: group.id=sysint-kafka-connect --- Setting property from CONNECT_BOOTSTRAP_SERVERS: bootstrap.servers=kafka:29093 --- Setting property from CONNECT_KEY_CONVERTER: key.converter=org.apache.kafka.connect.json.JsonConverter --- Setting property from CONNECT_TASK_SHUTDOWN_GRACEFUL_TIMEOUT_MS: task.shutdown.graceful.timeout.ms=10000 --- Setting property from CONNECT_REST_HOST_NAME: rest.host.name=172.18.0.6 --- Setting property from CONNECT_PLUGIN_PATH: plugin.path=/kafka/connect --- Setting property from CONNECT_REST_PORT: rest.port=8083 --- Setting property from CONNECT_OFFSET_FLUSH_TIMEOUT_MS: offset.flush.timeout.ms=5000 --- Setting property from CONNECT_STATUS_STORAGE_TOPIC: status.storage.topic=_sysint_connect_status --- Setting property from CONNECT_INTERNAL_KEY_CONVERTER: internal.key.converter=org.apache.kafka.connect.json.JsonConverter --- Setting property from CONNECT_CONFIG_STORAGE_TOPIC: config.storage.topic=_sysint_connect_configs --- Setting property from CONNECT_REST_ADVERTISED_PORT: rest.advertised.port=8083 --- Setting property from CONNECT_OFFSET_STORAGE_TOPIC: offset.storage.topic=_sysint_connect_offsets 2020-10-16 09:15:43,581 INFO || WorkerInfo values: jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dcom.sun.management.jmxremote.port=1976, 
-Dkafka.logs.dir=/kafka/bin/../logs, -Dlog4j.configuration=file:/kafka/config/log4j.properties, -javaagent:/kafka/jmx_prometheus_javaagent.jar=8080:/kafka/config.yml jvm.spec = Oracle Corporation, OpenJDK 64-Bit Server VM, 11.0.8, 11.0.8+10-LTS jvm.classpath = /kafka/bin/../libs/activation-1.1.1.jar:/kafka/bin/../libs/aopalliance-repackaged-2.5.0.jar:/kafka/bin/../libs/argparse4j-0.7.0.jar:/kafka/bin/../libs/audience-annotations-0.5.0.jar:/kafka/bin/../libs/avro-1.9.2.jar:/kafka/bin/../libs/common-config-5.5.1.jar:/kafka/bin/../libs/common-utils-5.5.1.jar:/kafka/bin/../libs/commons-cli-1.4.jar:/kafka/bin/../libs/commons-lang3-3.8.1.jar:/kafka/bin/../libs/connect-api-2.6.0.jar:/kafka/bin/../libs/connect-basic-auth-extension-2.6.0.jar:/kafka/bin/../libs/connect-file-2.6.0.jar:/kafka/bin/../libs/connect-json-2.6.0.jar:/kafka/bin/../libs/connect-mirror-2.6.0.jar:/kafka/bin/../libs/connect-mirror-client-2.6.0.jar:/kafka/bin/../libs/connect-runtime-2.6.0.jar:/kafka/bin/../libs/connect-transforms-2.6.0.jar:/kafka/bin/../libs/hk2-api-2.5.0.jar:/kafka/bin/../libs/hk2-locator-2.5.0.jar:/kafka/bin/../libs/hk2-utils-2.5.0.jar:/kafka/bin/../libs/jackson-annotations-2.10.2.jar:/kafka/bin/../libs/jackson-core-2.10.2.jar:/kafka/bin/../libs/jackson-databind-2.10.2.jar:/kafka/bin/../libs/jackson-dataformat-csv-2.10.2.jar:/kafka/bin/../libs/jackson-datatype-jdk8-2.10.2.jar:/kafka/bin/../libs/jackson-jaxrs-base-2.10.2.jar:/kafka/bin/../libs/jackson-jaxrs-json-provider-2.10.2.jar:/kafka/bin/../libs/jackson-module-jaxb-annotations-2.10.2.jar:/kafka/bin/../libs/jackson-module-paranamer-2.10.2.jar:/kafka/bin/../libs/jackson-module-scala_2.12-2.10.2.jar:/kafka/bin/../libs/jakarta.activation-api-1.2.1.jar:/kafka/bin/../libs/jakarta.annotation-api-1.3.4.jar:/kafka/bin/../libs/jakarta.inject-2.5.0.jar:/kafka/bin/../libs/jakarta.ws.rs-api-2.1.5.jar:/kafka/bin/../libs/jakarta.xml.bind-api-2.3.2.jar:/kafka/bin/../libs/javassist-3.22.0-CR2.jar:/kafka/bin/../libs/javassist-3.26.0-GA.jar:/kafka/bin/../libs/javax.servlet-api-3.1.0.jar:/kafka/bin/../libs/javax.ws.rs-api-2.1.1.jar:/kafka/bin/../libs/jaxb-api-2.3.0.jar:/kafka/bin/../libs/jersey-client-2.28.jar:/kafka/bin/../libs/jersey-common-2.28.jar:/kafka/bin/../libs/jersey-container-servlet-2.28.jar:/kafka/bin/../libs/jersey-container-servlet-core-2.28.jar:/kafka/bin/../libs/jersey-hk2-2.28.jar:/kafka/bin/../libs/jersey-media-jaxb-2.28.jar:/kafka/bin/../libs/jersey-server-2.28.jar:/kafka/bin/../libs/jetty-client-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-continuation-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-http-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-io-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-security-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-server-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-servlet-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-servlets-9.4.24.v20191120.jar:/kafka/bin/../libs/jetty-util-9.4.24.v20191120.jar:/kafka/bin/../libs/jopt-simple-5.0.4.jar:/kafka/bin/../libs/kafka-avro-serializer-5.5.1.jar:/kafka/bin/../libs/kafka-clients-2.6.0.jar:/kafka/bin/../libs/kafka-connect-avro-converter-5.5.1.jar:/kafka/bin/../libs/kafka-connect-avro-data-5.5.1.jar:/kafka/bin/../libs/kafka-log4j-appender-2.6.0.jar:/kafka/bin/../libs/kafka-schema-registry-client-5.5.1.jar:/kafka/bin/../libs/kafka-schema-serializer-5.5.1.jar:/kafka/bin/../libs/kafka-streams-2.6.0.jar:/kafka/bin/../libs/kafka-streams-examples-2.6.0.jar:/kafka/bin/../libs/kafka-streams-scala_2.12-2.6.0.jar:/kafka/bin/../libs/kafka-streams-test-utils-2.6.0.jar:/kafka/bin/../libs/kafka
-tools-2.6.0.jar:/kafka/bin/../libs/kafka_2.12-2.6.0.jar:/kafka/bin/../libs/log4j-1.2.17.jar:/kafka/bin/../libs/lz4-java-1.7.1.jar:/kafka/bin/../libs/maven-artifact-3.6.3.jar:/kafka/bin/../libs/metrics-core-2.2.0.jar:/kafka/bin/../libs/netty-buffer-4.1.50.Final.jar:/kafka/bin/../libs/netty-codec-4.1.50.Final.jar:/kafka/bin/../libs/netty-common-4.1.50.Final.jar:/kafka/bin/../libs/netty-handler-4.1.50.Final.jar:/kafka/bin/../libs/netty-resolver-4.1.50.Final.jar:/kafka/bin/../libs/netty-transport-4.1.50.Final.jar:/kafka/bin/../libs/netty-transport-native-epoll-4.1.50.Final.jar:/kafka/bin/../libs/netty-transport-native-unix-common-4.1.50.Final.jar:/kafka/bin/../libs/osgi-resource-locator-1.0.1.jar:/kafka/bin/../libs/paranamer-2.8.jar:/kafka/bin/../libs/plexus-utils-3.2.1.jar:/kafka/bin/../libs/reflections-0.9.12.jar:/kafka/bin/../libs/rocksdbjni-5.18.4.jar:/kafka/bin/../libs/scala-collection-compat_2.12-2.1.6.jar:/kafka/bin/../libs/scala-java8-compat_2.12-0.9.1.jar:/kafka/bin/../libs/scala-library-2.12.11.jar:/kafka/bin/../libs/scala-logging_2.12-3.9.2.jar:/kafka/bin/../libs/scala-reflect-2.12.11.jar:/kafka/bin/../libs/slf4j-api-1.7.30.jar:/kafka/bin/../libs/slf4j-log4j12-1.7.30.jar:/kafka/bin/../libs/snappy-java-1.1.7.3.jar:/kafka/bin/../libs/validation-api-2.0.1.Final.jar:/kafka/bin/../libs/zookeeper-3.5.8.jar:/kafka/bin/../libs/zookeeper-jute-3.5.8.jar:/kafka/bin/../libs/zstd-jni-1.4.4-7.jar os.spec = Linux, amd64, 4.19.76-linuxkit os.vcpus = 4 [org.apache.kafka.connect.runtime.WorkerInfo] 2020-10-16 09:15:43,584 INFO || Scanning for plugin classes. This might take a moment ... [org.apache.kafka.connect.cli.ConnectDistributed] 2020-10-16 09:15:43,594 INFO || Loading plugin from: /kafka/connect/kafka-connect-insert-uuid [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,641 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/kafka-connect-insert-uuid/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,641 INFO || Added plugin 'com.github.cjmatta.kafka.connect.smt.InsertUuid$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,641 INFO || Added plugin 'com.github.cjmatta.kafka.connect.smt.InsertUuid$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,642 INFO || Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,642 INFO || Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,642 INFO || Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,643 INFO || Loading plugin from: /kafka/connect/debezium-connector-mongodb [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,857 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-mongodb/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,857 INFO || Added plugin 'io.debezium.connector.mongodb.MongoDbConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,857 INFO || Added plugin 'io.debezium.converters.ByteBufferConverter' 
[org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,857 INFO || Added plugin 'io.debezium.converters.CloudEventsConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,857 INFO || Added plugin 'io.debezium.transforms.ExtractNewRecordState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,858 INFO || Added plugin 'io.debezium.transforms.outbox.EventRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,858 INFO || Added plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,858 INFO || Added plugin 'io.debezium.transforms.ByLogicalTableRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:43,858 INFO || Loading plugin from: /kafka/connect/debezium-connector-mysql [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,047 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-mysql/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,047 INFO || Added plugin 'io.debezium.connector.mysql.MySqlConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,050 INFO || Loading plugin from: /kafka/connect/debezium-connector-sqlserver [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,113 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-sqlserver/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,113 INFO || Added plugin 'io.debezium.connector.sqlserver.SqlServerConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,125 INFO || Loading plugin from: /kafka/connect/debezium-connector-postgres [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,234 INFO || Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-postgres/} [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,234 INFO || Added plugin 'io.debezium.connector.postgresql.PostgresConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Registered loader: jdk.internal.loader.ClassLoaders$AppClassLoader@3d4eac69 [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 
'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,991 INFO || Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.tools.MockConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.converters.FloatConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'io.confluent.connect.avro.AvroConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.converters.LongConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.json.JsonConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.storage.StringConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.converters.ShortConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,992 INFO || Added plugin 'org.apache.kafka.connect.transforms.Filter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' 
[org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,993 INFO || Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' 
[org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,994 INFO || Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'MongoDbConnector' and 'MongoDb' to plugin 'io.debezium.connector.mongodb.MongoDbConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'MySqlConnector' and 'MySql' to plugin 'io.debezium.connector.mysql.MySqlConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'PostgresConnector' and 'Postgres' to plugin 'io.debezium.connector.postgresql.PostgresConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'SqlServerConnector' and 'SqlServer' to plugin 'io.debezium.connector.sqlserver.SqlServerConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,995 INFO || Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 
09:15:44,996 INFO || Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'CloudEventsConverter' and 'CloudEvents' to plugin 'io.debezium.converters.CloudEventsConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,996 INFO || Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 
'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,997 INFO || Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'ExtractNewDocumentState' to plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'ByLogicalTableRouter' to plugin 'io.debezium.transforms.ByLogicalTableRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'ExtractNewRecordState' to plugin 'io.debezium.transforms.ExtractNewRecordState' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'EventRouter' to plugin 'io.debezium.transforms.outbox.EventRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added aliases 'PredicatedTransformation' and 'Predicated' to plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'Filter' to plugin 'org.apache.kafka.connect.transforms.Filter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,998 INFO || Added alias 'HasHeaderKey' to plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,999 INFO || Added alias 'RecordIsTombstone' to plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,999 INFO || Added alias 'TopicNameMatches' to plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,999 INFO || Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,999 INFO || Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' 
[org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,999 INFO || Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:44,999 INFO || Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' [org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader] 2020-10-16 09:15:45,034 INFO || DistributedConfig values: access.control.allow.methods = access.control.allow.origin = admin.listeners = null bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = config.providers = [] config.storage.replication.factor = 1 config.storage.topic = _sysint_connect_configs connect.protocol = sessioned connections.max.idle.ms = 540000 connector.client.config.override.policy = None group.id = sysint-kafka-connect header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter heartbeat.interval.ms = 3000 inter.worker.key.generation.algorithm = HmacSHA256 inter.worker.key.size = null inter.worker.key.ttl.ms = 3600000 inter.worker.signature.algorithm = HmacSHA256 inter.worker.verification.algorithms = [HmacSHA256] internal.key.converter = class org.apache.kafka.connect.json.JsonConverter internal.value.converter = class org.apache.kafka.connect.json.JsonConverter key.converter = class org.apache.kafka.connect.json.JsonConverter listeners = null metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 offset.flush.interval.ms = 5000 offset.flush.timeout.ms = 5000 offset.storage.partitions = 25 offset.storage.replication.factor = 1 offset.storage.topic = _sysint_connect_offsets plugin.path = [/kafka/connect] rebalance.timeout.ms = 60000 receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 40000 response.http.headers.config = rest.advertised.host.name = 172.18.0.6 rest.advertised.listener = null rest.advertised.port = 8083 rest.extension.classes = [] rest.host.name = 172.18.0.6 rest.port = 8083 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI scheduled.rebalance.max.delay.ms = 300000 security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS status.storage.partitions = 5 
status.storage.replication.factor = 1 status.storage.topic = _sysint_connect_status task.shutdown.graceful.timeout.ms = 10000 topic.creation.enable = true topic.tracking.allow.reset = true topic.tracking.enable = true value.converter = class org.apache.kafka.connect.json.JsonConverter worker.sync.timeout.ms = 3000 worker.unsync.backoff.ms = 300000 [org.apache.kafka.connect.runtime.distributed.DistributedConfig] 2020-10-16 09:15:45,035 INFO || Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. [org.apache.kafka.connect.runtime.WorkerConfig] 2020-10-16 09:15:45,035 INFO || Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. [org.apache.kafka.connect.runtime.WorkerConfig] 2020-10-16 09:15:45,036 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,038 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'group.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,088 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
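The two deprecation warnings above ('internal.key.converter' and 'internal.value.converter') say the supplied value matches the default, so those entries can simply be dropped. A minimal sketch, assuming the worker is configured through a connect-distributed.properties-style file (the file name is illustrative; the same applies however the worker configuration is actually supplied):

    # Deprecated and identical to the default JsonConverter - can be removed, as the log suggests:
    # internal.key.converter=org.apache.kafka.connect.json.JsonConverter
    # internal.value.converter=org.apache.kafka.connect.json.JsonConverter

    # The externally visible converters stay as shown in the DistributedConfig values above:
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter

The long run of "was supplied but isn't a known config" WARN lines is expected: the worker passes its full configuration to the embedded admin client, which warns about the Connect-only keys it does not recognize.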
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,089 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,089 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,089 INFO || Kafka startTimeMs: 1602839745089 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,300 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,313 INFO || Logging initialized @2169ms to org.eclipse.jetty.util.log.Slf4jLog [org.eclipse.jetty.util.log] 2020-10-16 09:15:45,342 INFO || Added connector for http://172.18.0.6:8083 [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,343 INFO || Initializing REST server [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,347 INFO || jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 11.0.8+10-LTS [org.eclipse.jetty.server.Server] 2020-10-16 09:15:45,372 INFO || Started http_172.18.0.68083@c808207{HTTP/1.1,[http/1.1]}{172.18.0.6:8083} [org.eclipse.jetty.server.AbstractConnector] 2020-10-16 09:15:45,375 INFO || Started @2228ms [org.eclipse.jetty.server.Server] 2020-10-16 09:15:45,389 INFO || Advertised URI: http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,389 INFO || REST server listening at http://172.18.0.6:8083/, advertising URL http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,389 INFO || Advertised URI: http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,389 INFO || REST admin endpoints at http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,389 INFO || Advertised URI: http://172.18.0.6:8083/ [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,390 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,390 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null 
ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,394 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,395 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,395 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,395 INFO || Kafka startTimeMs: 1602839745395 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,404 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,411 INFO || Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden [org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy] 2020-10-16 09:15:45,415 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,416 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,418 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,419 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,419 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,419 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,419 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,419 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,419 INFO || Kafka startTimeMs: 1602839745419 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,429 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,433 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,434 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,434 INFO || Kafka startTimeMs: 1602839745433 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,511 INFO || JsonConverterConfig values: converter.type = key decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:15:45,512 INFO || JsonConverterConfig values: converter.type = value decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:15:45,512 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,512 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,514 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,514 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,514 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,514 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,514 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
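The JsonConverterConfig lines above show both the key and value converters running with schemas.enable = false. As a purely illustrative example (the record below is hypothetical, not taken from this log), the serialized form differs roughly as follows depending on that flag:

    schemas.enable=false  (plain JSON payload):
        {"id": 42, "name": "example"}

    schemas.enable=true   (payload wrapped in an envelope carrying its schema):
        {"schema": {"type": "struct", "optional": false, "fields": [{"field": "id", "type": "int32", "optional": false}, {"field": "name", "type": "string", "optional": true}]}, "payload": {"id": 42, "name": "example"}}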
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,515 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,515 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,515 INFO || Kafka startTimeMs: 1602839745515 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,524 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,530 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,531 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,533 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,533 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,534 INFO || Kafka startTimeMs: 1602839745533 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,543 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,546 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,547 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,548 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,549 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,549 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,549 INFO || Kafka startTimeMs: 1602839745549 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,557 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,566 INFO || Creating Kafka admin client [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,566 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,568 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 WARN || The configuration 'key.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,569 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,569 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,569 INFO || Kafka startTimeMs: 1602839745569 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,577 INFO || Kafka cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.connect.util.ConnectUtils] 2020-10-16 09:15:45,593 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,593 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,593 INFO || Kafka startTimeMs: 1602839745592 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,595 INFO || Kafka Connect distributed worker initialization took 2010ms [org.apache.kafka.connect.cli.ConnectDistributed] 2020-10-16 09:15:45,595 INFO || Kafka Connect starting [org.apache.kafka.connect.runtime.Connect] 2020-10-16 09:15:45,596 INFO || Initializing REST resources [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,596 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder starting [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:45,600 INFO || Worker starting [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:45,600 INFO || Starting KafkaOffsetBackingStore [org.apache.kafka.connect.storage.KafkaOffsetBackingStore] 2020-10-16 09:15:45,600 INFO || Starting KafkaBasedLog with topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,601 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,602 WARN || The configuration 'group.id' was supplied but isn't a known config. 
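At this point the worker has finished initializing ("Kafka Connect distributed worker initialization took 2010ms") and the REST server started earlier in the log is listening on the advertised URL http://172.18.0.6:8083/. A quick sanity check is to query the standard Connect REST endpoints; the curl calls below are illustrative, and the address is the one advertised in this log, so it is only reachable from a host that can route to that network (a mapped port may be needed from outside it):

    # Worker identity: version, commit id and the Kafka cluster ID seen in the log
    curl -s http://172.18.0.6:8083/

    # Connector plugins registered from plugin.path during the scan above
    curl -s http://172.18.0.6:8083/connector-plugins

    # Connectors currently defined (an empty list on a fresh worker)
    curl -s http://172.18.0.6:8083/connectors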
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,602 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,602 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,603 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,604 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,604 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,604 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,604 INFO || Kafka startTimeMs: 1602839745604 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,648 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-1 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,653 INFO || Adding admin resources to main listener [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:45,660 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,665 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,666 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,666 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,666 INFO || Kafka startTimeMs: 1602839745666 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,676 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-sysint-kafka-connect-1 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = sysint-kafka-connect group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,677 INFO || [Producer clientId=producer-1] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,692 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'plugin.path' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,693 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,694 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,694 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,694 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,694 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,694 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,694 INFO || Kafka startTimeMs: 1602839745694 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,698 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,713 INFO || DefaultSessionIdManager workerName=node0 [org.eclipse.jetty.server.session] 2020-10-16 09:15:45,713 INFO || No SessionScavenger set, using defaults [org.eclipse.jetty.server.session] 2020-10-16 09:15:45,714 INFO || node0 Scavenging every 600000ms [org.eclipse.jetty.server.session] 2020-10-16 09:15:45,716 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Subscribed to partition(s): _sysint_connect_offsets-0, _sysint_connect_offsets-5, _sysint_connect_offsets-10, _sysint_connect_offsets-20, _sysint_connect_offsets-15, _sysint_connect_offsets-9, _sysint_connect_offsets-11, _sysint_connect_offsets-4, _sysint_connect_offsets-16, _sysint_connect_offsets-17, _sysint_connect_offsets-3, _sysint_connect_offsets-24, _sysint_connect_offsets-23, _sysint_connect_offsets-13, _sysint_connect_offsets-18, _sysint_connect_offsets-22, _sysint_connect_offsets-2, _sysint_connect_offsets-8, _sysint_connect_offsets-12, _sysint_connect_offsets-19, _sysint_connect_offsets-14, _sysint_connect_offsets-1, _sysint_connect_offsets-6, _sysint_connect_offsets-7, _sysint_connect_offsets-21 [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:15:45,718 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-0 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-5 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-10 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-20 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-15 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-9 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-11 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, 
groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-4 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-16 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-17 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-3 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-24 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-23 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-13 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-18 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-22 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-2 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-8 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-12 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-19 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-14 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-1 
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-6 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-7 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,719 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_offsets-21 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,758 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-24 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,758 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-18 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,758 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-16 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-22 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-20 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-9 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-7 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-13 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-11 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-1 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-5 to offset 0. 
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-3 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-23 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-17 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-15 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,759 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-21 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-19 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-10 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-8 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-14 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-12 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-2 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-0 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-6 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,760 INFO || [Consumer clientId=consumer-sysint-kafka-connect-1, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_offsets-4 to offset 0. 
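At this point the worker's KafkaBasedLog consumer has assigned all 25 partitions of _sysint_connect_offsets, seeked them to the beginning and is replaying the topic to rebuild its source-offset snapshot (the "Finished reading KafkaBasedLog" and "Worker started" entries follow just below). When connector offsets look stale or missing it can help to replay the same topic yourself; the following is a minimal, read-only sketch that mirrors the consumer settings shown above (ByteArrayDeserializer, no auto-commit). Topic and broker names are reused from this log; decoding keys and values as UTF-8 assumes the JSON payloads Connect writes there, and a single poll is only enough for a small topic.

import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.stream.Collectors;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class DumpConnectOffsets {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:29093");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // read-only, like the worker
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // Assign every partition of the offsets topic and rewind to the beginning,
            // which is what the worker's KafkaBasedLog does on startup.
            List<TopicPartition> partitions = consumer.partitionsFor("_sysint_connect_offsets").stream()
                    .map(p -> new TopicPartition(p.topic(), p.partition()))
                    .collect(Collectors.toList());
            consumer.assign(partitions);
            consumer.seekToBeginning(partitions);

            for (ConsumerRecord<byte[], byte[]> record : consumer.poll(Duration.ofSeconds(5))) {
                String key = record.key() == null ? "null" : new String(record.key(), StandardCharsets.UTF_8);
                String value = record.value() == null ? "null" : new String(record.value(), StandardCharsets.UTF_8);
                System.out.printf("%s-%d@%d key=%s value=%s%n",
                        record.topic(), record.partition(), record.offset(), key, value);
            }
        }
    }
}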
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,789 INFO || Finished reading KafkaBasedLog for topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,789 INFO || Started KafkaBasedLog for topic _sysint_connect_offsets [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,789 INFO || Finished reading offsets topic and starting KafkaOffsetBackingStore [org.apache.kafka.connect.storage.KafkaOffsetBackingStore] 2020-10-16 09:15:45,792 INFO || Worker started [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:45,792 INFO || Starting KafkaBasedLog with topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,792 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,794 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,794 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,794 INFO || Kafka startTimeMs: 1602839745794 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,820 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-2 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000  retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,828 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,828 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,828 INFO || Kafka startTimeMs: 1602839745828 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,832 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-sysint-kafka-connect-2 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = sysint-kafka-connect group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,835 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,836 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,836 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,835 INFO || [Producer clientId=producer-2] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,836 INFO || Kafka startTimeMs: 1602839745836 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,840 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,843 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Subscribed to partition(s): _sysint_connect_status-0, _sysint_connect_status-4, _sysint_connect_status-1, _sysint_connect_status-2, _sysint_connect_status-3 [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:15:45,844 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-0 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,844 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-4 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,844 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-1 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,844 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-2 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,844 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_status-3 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,852 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-2 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,852 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-1 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,852 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-0 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,852 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-4 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,852 INFO || [Consumer clientId=consumer-sysint-kafka-connect-2, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_status-3 to offset 0. 
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,874 INFO || Finished reading KafkaBasedLog for topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,874 INFO || Started KafkaBasedLog for topic _sysint_connect_status [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,877 INFO || Starting KafkaConfigBackingStore [org.apache.kafka.connect.storage.KafkaConfigBackingStore] 2020-10-16 09:15:45,877 INFO || Starting KafkaBasedLog with topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,878 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'value.converter' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,882 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:45,883 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,883 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,883 INFO || Kafka startTimeMs: 1602839745882 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,897 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-3 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,900 WARN || The configuration 'group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,900 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,900 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:45,901 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,901 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,901 INFO || Kafka startTimeMs: 1602839745901 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,902 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-sysint-kafka-connect-3 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = sysint-kafka-connect group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'rest.advertised.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'task.shutdown.graceful.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'plugin.path' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'status.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'offset.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'config.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'rest.advertised.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'status.storage.topic' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'rest.host.name' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'offset.flush.timeout.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'config.storage.replication.factor' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'rest.port' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'internal.key.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'internal.value.converter' was supplied but isn't a known config. [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 WARN || The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
[org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:45,904 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,904 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,904 INFO || Kafka startTimeMs: 1602839745904 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:45,904 INFO || [Producer clientId=producer-3] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,908 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,912 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Subscribed to partition(s): _sysint_connect_configs-0 [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:15:45,912 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Seeking to EARLIEST offset of partition _sysint_connect_configs-0 [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,916 INFO || [Consumer clientId=consumer-sysint-kafka-connect-3, groupId=sysint-kafka-connect] Resetting offset for partition _sysint_connect_configs-0 to offset 0. [org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:45,923 INFO || Finished reading KafkaBasedLog for topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,923 INFO || Started KafkaBasedLog for topic _sysint_connect_configs [org.apache.kafka.connect.util.KafkaBasedLog] 2020-10-16 09:15:45,923 INFO || Started KafkaConfigBackingStore [org.apache.kafka.connect.storage.KafkaConfigBackingStore] 2020-10-16 09:15:45,923 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Herder started [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:45,932 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:45,932 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Discovered group coordinator kafka:29093 (id: 2147483646 rack: null) [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:45,937 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Rebalance started [org.apache.kafka.connect.runtime.distributed.WorkerCoordinator] 2020-10-16 09:15:45,938 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:45,946 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group. 
[org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:45,946 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:45,968 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Successfully joined group with generation 5 [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:45,970 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Joined group at generation 5 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-e3da7167-cfbb-4d95-8564-b15e17791a7f', leaderUrl='http://172.18.0.6:8083/', offset=4, connectorIds=[sysint-sqlserver-tec-runonly-connector], taskIds=[sysint-sqlserver-tec-runonly-connector-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:45,971 WARN || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Catching up to assignment's config offset. [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:45,971 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Current config state offset -1 is behind group assignment 4, reading to end of config log [org.apache.kafka.connect.runtime.distributed.DistributedHerder] Oct 16, 2020 9:15:46 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored. Oct 16, 2020 9:15:46 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource will be ignored. Oct 16, 2020 9:15:46 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored. Oct 16, 2020 9:15:46 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored. Oct 16, 2020 9:15:46 AM org.glassfish.jersey.internal.Errors logErrors WARNING: The following warnings have been detected: WARNING: The (sub)resource method listLoggers in org.apache.kafka.connect.runtime.rest.resources.LoggingResource contains empty path annotation. WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation. 
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation. WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation. WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation. 2020-10-16 09:15:46,162 INFO || Started o.e.j.s.ServletContextHandler@100c8b75{/,null,AVAILABLE} [org.eclipse.jetty.server.handler.ContextHandler] 2020-10-16 09:15:46,163 INFO || REST resources initialized; server is started and ready to handle requests [org.apache.kafka.connect.runtime.rest.RestServer] 2020-10-16 09:15:46,163 INFO || Kafka Connect started [org.apache.kafka.connect.runtime.Connect] 2020-10-16 09:15:46,420 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Finished reading to end of log and updated config snapshot, new config log offset: 4 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:46,420 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting connectors and tasks using config offset 4 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:46,422 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting task sysint-sqlserver-tec-runonly-connector-0 [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:46,422 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Starting connector sysint-sqlserver-tec-runonly-connector [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:46,425 INFO || Creating task sysint-sqlserver-tec-runonly-connector-0 [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,425 INFO || Creating connector sysint-sqlserver-tec-runonly-connector of type io.debezium.connector.sqlserver.SqlServerConnector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,429 INFO || ConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig] 2020-10-16 09:15:46,429 INFO || SourceConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.SourceConnectorConfig] 2020-10-16 09:15:46,441 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = 
true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:15:46,443 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:15:46,448 INFO || TaskConfig values: task.class = class io.debezium.connector.sqlserver.SqlServerConnectorTask [org.apache.kafka.connect.runtime.TaskConfig] 2020-10-16 09:15:46,449 INFO || Instantiated task sysint-sqlserver-tec-runonly-connector-0 with version 1.3.0.Final of type io.debezium.connector.sqlserver.SqlServerConnectorTask [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,450 INFO || Instantiated connector sysint-sqlserver-tec-runonly-connector with version 1.3.0.Final of type class io.debezium.connector.sqlserver.SqlServerConnector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,449 
INFO || JsonConverterConfig values: converter.type = key decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:15:46,453 INFO || JsonConverterConfig values: converter.type = value decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false [org.apache.kafka.connect.json.JsonConverterConfig] 2020-10-16 09:15:46,453 INFO || Finished creating connector sysint-sqlserver-tec-runonly-connector [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,453 INFO || Set up the key converter class org.apache.kafka.connect.json.JsonConverter for task sysint-sqlserver-tec-runonly-connector-0 using the connector config [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,453 INFO || Set up the value converter class org.apache.kafka.connect.json.JsonConverter for task sysint-sqlserver-tec-runonly-connector-0 using the connector config [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,454 INFO || Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task sysint-sqlserver-tec-runonly-connector-0 using the worker config [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,456 INFO || SourceConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.SourceConnectorConfig] 2020-10-16 09:15:46,459 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:15:46,470 INFO || Initializing: 
org.apache.kafka.connect.runtime.TransformationChain{io.debezium.transforms.ExtractNewRecordState, org.apache.kafka.connect.transforms.RegexRouter, com.github.cjmatta.kafka.connect.smt.InsertUuid$Value} [org.apache.kafka.connect.runtime.Worker] 2020-10-16 09:15:46,472 INFO || ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29093] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = connector-producer-sysint-sqlserver-tec-runonly-connector-0 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 9223372036854775807 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 2147483647 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:46,477 WARN || The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:46,477 WARN || The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. 
[org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:46,477 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,477 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,477 INFO || Kafka startTimeMs: 1602839746477 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,480 INFO || [Producer clientId=connector-producer-sysint-sqlserver-tec-runonly-connector-0] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:46,484 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Finished starting connectors and tasks [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:46,497 INFO || [Worker clientId=connect-1, groupId=sysint-kafka-connect] Session key updated [org.apache.kafka.connect.runtime.distributed.DistributedHerder] 2020-10-16 09:15:46,515 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,515 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,515 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,515 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,516 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,516 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,516 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,516 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,519 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,519 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,519 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,519 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. 
[io.debezium.config.Configuration] 2020-10-16 09:15:46,519 INFO || Starting SqlServerConnectorTask with configuration: [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,520 INFO || connector.class = io.debezium.connector.sqlserver.SqlServerConnector [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || errors.log.include.messages = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.unwrap.delete.handling.mode = rewrite [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || tasks.max = 1 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.history.kafka.topic = it.company.sysint.data.cdc.db.history.tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms = unwrap,route,insertuuid [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || provide.transaction.metadata = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || table.whitelist = dbo.tab1,dbo.tab2 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || tombstones.on.delete = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.route.type = org.apache.kafka.connect.transforms.RegexRouter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.route.regex = (.*) [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || decimal.handling.mode = string [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.unwrap.drop.tombstones = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.unwrap.type = io.debezium.transforms.ExtractNewRecordState [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || value.converter = org.apache.kafka.connect.json.JsonConverter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || errors.log.enable = true [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || key.converter = org.apache.kafka.connect.json.JsonConverter [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.user = sa [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.dbname = tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.history.kafka.bootstrap.servers = kafka:29093 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.insertuuid.uuid.field.name = __uuid [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.server.name = tec [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || snapshot.isolation.mode = read_committed [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || event.processing.failure.handling.mode = warn [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.unwrap.add.headers = version,connector,name [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.port = 1433 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || key.converter.schemas.enable = false 
[io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || task.class = io.debezium.connector.sqlserver.SqlServerConnectorTask [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.hostname = sqlserver [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || database.password = ******** [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || value.converter.schemas.enable = false [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || name = sysint-sqlserver-tec-runonly-connector [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || transforms.unwrap.add.fields = schema,db,table,op,ts_ms,change_lsn,commit_lsn,event_serial_no,data_collection_order [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,521 INFO || retriable.restart.connector.wait.ms = 10000 [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,522 INFO || snapshot.mode = schema_only [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,522 INFO || transforms.insertuuid.type = com.github.cjmatta.kafka.connect.smt.InsertUuid$Value [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,527 WARN || Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,527 WARN || Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,527 WARN || Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,529 WARN || Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,529 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,529 WARN || Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead. [io.debezium.config.Configuration] 2020-10-16 09:15:46,529 WARN || Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead. [io.debezium.config.Configuration]
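For reference, the task configuration logged above by BaseSourceTask corresponds roughly to the following connector registration call. This is a reconstruction from the logged values only (the REST address 172.18.0.6:8083 comes from the worker's advertised host/port earlier in the log); the password is masked in the log and shown here as a placeholder, and any property not printed above is omitted.

  curl -s -X PUT http://172.18.0.6:8083/connectors/sysint-sqlserver-tec-runonly-connector/config \
    -H "Content-Type: application/json" \
    -d '{
      "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
      "tasks.max": "1",
      "database.hostname": "sqlserver",
      "database.port": "1433",
      "database.user": "sa",
      "database.password": "<masked in log>",
      "database.dbname": "tec",
      "database.server.name": "tec",
      "table.whitelist": "dbo.tab1,dbo.tab2",
      "database.history.kafka.bootstrap.servers": "kafka:29093",
      "database.history.kafka.topic": "it.company.sysint.data.cdc.db.history.tec",
      "snapshot.mode": "schema_only",
      "snapshot.isolation.mode": "read_committed",
      "decimal.handling.mode": "string",
      "provide.transaction.metadata": "true",
      "tombstones.on.delete": "false",
      "event.processing.failure.handling.mode": "warn",
      "retriable.restart.connector.wait.ms": "10000",
      "errors.log.enable": "true",
      "errors.log.include.messages": "true",
      "key.converter": "org.apache.kafka.connect.json.JsonConverter",
      "key.converter.schemas.enable": "false",
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable": "false",
      "transforms": "unwrap,route,insertuuid",
      "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
      "transforms.unwrap.drop.tombstones": "false",
      "transforms.unwrap.delete.handling.mode": "rewrite",
      "transforms.unwrap.add.fields": "schema,db,table,op,ts_ms,change_lsn,commit_lsn,event_serial_no,data_collection_order",
      "transforms.unwrap.add.headers": "version,connector,name",
      "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
      "transforms.route.regex": "(.*)",
      "transforms.route.replacement": "it.company.sysint.data.cdc.tables.$1",
      "transforms.insertuuid.type": "com.github.cjmatta.kafka.connect.smt.InsertUuid$Value",
      "transforms.insertuuid.uuid.field.name": "__uuid"
    }'

The log then continues with the database history client setup: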
2020-10-16 09:15:46,758 INFO || KafkaDatabaseHistory Consumer config: {key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, enable.auto.commit=false, group.id=tec-dbhistory, bootstrap.servers=kafka:29093, fetch.min.bytes=1, session.timeout.ms=10000, auto.offset.reset=earliest, client.id=tec-dbhistory} [io.debezium.relational.history.KafkaDatabaseHistory] 2020-10-16 09:15:46,758 INFO || KafkaDatabaseHistory Producer config: {retries=1, value.serializer=org.apache.kafka.common.serialization.StringSerializer, acks=1, batch.size=32768, max.block.ms=10000, bootstrap.servers=kafka:29093, buffer.memory=1048576, key.serializer=org.apache.kafka.common.serialization.StringSerializer, client.id=tec-dbhistory, linger.ms=0} [io.debezium.relational.history.KafkaDatabaseHistory] 2020-10-16 09:15:46,759 INFO || Requested thread factory for connector SqlServerConnector, id = tec named = db-history-config-check [io.debezium.util.Threads] 2020-10-16 09:15:46,761 INFO || ProducerConfig values: acks = 1 batch.size = 32768 bootstrap.servers = [kafka:29093] buffer.memory = 1048576 client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 10000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 1 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer [org.apache.kafka.clients.producer.ProducerConfig] 2020-10-16 09:15:46,764 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,764 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16
09:15:46,764 INFO || Kafka startTimeMs: 1602839746764 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,764 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tec-dbhistory group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:46,771 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,771 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,771 INFO || Kafka startTimeMs: 1602839746771 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,774 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:46,776 INFO || [Producer clientId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:46,795 INFO || Found previous offset SqlServerOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT}, sourceInfo=SourceInfo [serverName=tec, changeLsn=0000003f:00001360:0021, commitLsn=0000003f:00001360:0071, eventSerialNo=null, snapshot=FALSE, sourceTime=null], partition={server=tec}, 
snapshotCompleted=false, eventSerialNo=1] [io.debezium.connector.common.BaseSourceTask] 2020-10-16 09:15:46,796 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tec-dbhistory group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:46,797 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,797 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,797 INFO || Kafka startTimeMs: 1602839746797 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,800 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:46,803 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = 
tec-dbhistory group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:46,806 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,806 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,806 INFO || Kafka startTimeMs: 1602839746806 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,806 INFO || Creating thread debezium-sqlserverconnector-tec-db-history-config-check [io.debezium.util.Threads] 2020-10-16 09:15:46,810 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA [org.apache.kafka.clients.Metadata] 2020-10-16 09:15:46,811 INFO || AdminClientConfig values: bootstrap.servers = [kafka:29093] client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory-topic-check connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000  retries = 1 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 
sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,814 INFO || Started database history recovery [io.debezium.relational.history.DatabaseHistoryMetrics] 2020-10-16 09:15:46,818 WARN || The configuration 'value.serializer' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,820 WARN || The configuration 'acks' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,821 WARN || The configuration 'batch.size' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,821 WARN || The configuration 'max.block.ms' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,821 WARN || The configuration 'buffer.memory' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,821 WARN || The configuration 'key.serializer' was supplied but isn't a known config. [org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,821 WARN || The configuration 'linger.ms' was supplied but isn't a known config. 
[org.apache.kafka.clients.admin.AdminClientConfig] 2020-10-16 09:15:46,821 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,821 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,821 INFO || Kafka startTimeMs: 1602839746821 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,822 INFO || ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29093] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = tec-dbhistory client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tec-dbhistory group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [org.apache.kafka.clients.consumer.ConsumerConfig] 2020-10-16 09:15:46,824 INFO || Kafka version: 2.6.0 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,824 INFO || Kafka commitId: 62abe01bee039651 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,824 INFO || Kafka startTimeMs: 1602839746824 [org.apache.kafka.common.utils.AppInfoParser] 2020-10-16 09:15:46,825 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Subscribed to topic(s): it.company.sysint.data.cdc.db.history.tec [org.apache.kafka.clients.consumer.KafkaConsumer] 2020-10-16 09:15:46,830 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Cluster ID: lAkC5TqFRP2c-SJAbs5uNA 
[org.apache.kafka.clients.Metadata] 2020-10-16 09:15:46,839 INFO || Database history topic 'it.company.sysint.data.cdc.db.history.tec' has correct settings [io.debezium.relational.history.KafkaDatabaseHistory] 2020-10-16 09:15:46,843 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Discovered group coordinator kafka:29093 (id: 2147483646 rack: null) [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:46,846 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:46,851 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group. [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:46,851 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] (Re-)joining group [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:46,858 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Finished assignment for group at generation 1: {tec-dbhistory-9539f335-338a-47ce-b8c9-8b5e17b60a17=Assignment(partitions=[it.company.sysint.data.cdc.db.history.tec-0])} [org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] 2020-10-16 09:15:46,862 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Successfully joined group with generation 1 [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:46,864 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Notifying assignor about the new Assignment(partitions=[it.company.sysint.data.cdc.db.history.tec-0]) [org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] 2020-10-16 09:15:46,864 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Adding newly assigned partitions: it.company.sysint.data.cdc.db.history.tec-0 [org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] 2020-10-16 09:15:46,871 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Found no committed offset for partition it.company.sysint.data.cdc.db.history.tec-0 [org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] 2020-10-16 09:15:46,872 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Resetting offset for partition it.company.sysint.data.cdc.db.history.tec-0 to offset 0. 
[org.apache.kafka.clients.consumer.internals.SubscriptionState] 2020-10-16 09:15:46,899 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Revoke previously assigned partitions it.company.sysint.data.cdc.db.history.tec-0 [org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] 2020-10-16 09:15:46,900 INFO || [Consumer clientId=tec-dbhistory, groupId=tec-dbhistory] Member tec-dbhistory-9539f335-338a-47ce-b8c9-8b5e17b60a17 sending LeaveGroup request to coordinator kafka:29093 (id: 2147483646 rack: null) due to the consumer is being closed [org.apache.kafka.clients.consumer.internals.AbstractCoordinator] 2020-10-16 09:15:46,903 INFO || Finished database history recovery of 15 change(s) in 89 ms [io.debezium.relational.history.DatabaseHistoryMetrics] 2020-10-16 09:15:46,927 INFO || Requested thread factory for connector SqlServerConnector, id = tec named = change-event-source-coordinator [io.debezium.util.Threads] 2020-10-16 09:15:46,929 INFO || Creating thread debezium-sqlserverconnector-tec-change-event-source-coordinator [io.debezium.util.Threads] 2020-10-16 09:15:46,929 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Source task finished initialization and start [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:46,932 INFO || Metrics registered [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:15:46,932 INFO || Context created [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:15:46,935 INFO || A previous offset indicating a completed snapshot has been found. Neither schema nor data will be snapshotted. [io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource] 2020-10-16 09:15:46,936 INFO || Snapshot ended with SnapshotResult [status=SKIPPED, offset=SqlServerOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT}, sourceInfo=SourceInfo [serverName=tec, changeLsn=0000003f:00001360:0021, commitLsn=0000003f:00001360:0071, eventSerialNo=null, snapshot=FALSE, sourceTime=null], partition={server=tec}, snapshotCompleted=false, eventSerialNo=1]] [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:15:46,938 INFO || Connected metrics set to 'true' [io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics] 2020-10-16 09:15:46,938 INFO || Starting streaming [io.debezium.pipeline.ChangeEventSourceCoordinator] 2020-10-16 09:15:46,999 INFO || SourceConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = [] transforms = [unwrap, route, insertuuid] value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.SourceConnectorConfig] 2020-10-16 09:15:47,007 INFO || EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.debezium.connector.sqlserver.SqlServerConnector errors.log.enable = true errors.log.include.messages = true errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.json.JsonConverter name = sysint-sqlserver-tec-runonly-connector predicates = [] tasks.max = 1 topic.creation.groups = 
[] transforms = [unwrap, route, insertuuid] transforms.insertuuid.negate = false transforms.insertuuid.predicate = transforms.insertuuid.type = class com.github.cjmatta.kafka.connect.smt.InsertUuid$Value transforms.insertuuid.uuid.field.name = __uuid transforms.route.negate = false transforms.route.predicate = transforms.route.regex = (.*) transforms.route.replacement = it.company.sysint.data.cdc.tables.$1 transforms.route.type = class org.apache.kafka.connect.transforms.RegexRouter transforms.unwrap.add.fields = [schema, db, table, op, ts_ms, change_lsn, commit_lsn, event_serial_no, data_collection_order] transforms.unwrap.add.headers = [version, connector, name] transforms.unwrap.delete.handling.mode = rewrite transforms.unwrap.drop.tombstones = false transforms.unwrap.negate = false transforms.unwrap.predicate = transforms.unwrap.route.by.field = transforms.unwrap.type = class io.debezium.transforms.ExtractNewRecordState value.converter = class org.apache.kafka.connect.json.JsonConverter [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig] 2020-10-16 09:15:47,026 INFO || CDC is enabled for table Capture instance "dbo_VatType" [sourceTableId=tec.dbo.VatType, changeTableId=tec.cdc.dbo_VatType_CT, startLsn=0000003f:00000038:0042, changeTableObjectId=683149479, stopLsn=NULL] but the table is not whitelisted by connector [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:15:47,027 INFO || CDC is enabled for table Capture instance "dbo_Registry" [sourceTableId=tec.dbo.Registry, changeTableId=tec.cdc.dbo_Registry_CT, startLsn=0000003f:000007d8:011f, changeTableObjectId=779149821, stopLsn=NULL] but the table is not whitelisted by connector [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:15:47,027 INFO || Last position recorded in offsets is 0000003f:00001360:0071(0000003f:00001360:0021)[1] [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:15:47,183 INFO || Skipping change ChangeTableResultSet{changeTable=Capture instance "dbo_Payment" [sourceTableId=tec.dbo.Payment, changeTableId=tec.cdc.dbo_Payment_CT, startLsn=00000037:00000b38:00b1, changeTableObjectId=1294627655, stopLsn=NULL], resultSet=SQLServerResultSet:16, completed=false, currentChangePosition=0000003f:00001360:0071(0000003f:00001360:0021)} as its order in the transaction 1 is smaller than or equal to the last recorded operation 0000003f:00001360:0071(0000003f:00001360:0021)[1] [io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource] 2020-10-16 09:15:51,483 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:51,484 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:51,493 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Finished commitOffsets successfully in 9 ms [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:56,493 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:15:56,494 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:01,494 INFO || 
WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:01,494 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:06,494 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:06,494 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:11,495 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:11,495 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:16,495 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:16,495 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:21,495 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:21,495 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:26,496 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:26,496 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:31,496 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask] 2020-10-16 09:16:31,497 INFO || WorkerSourceTask{id=sysint-sqlserver-tec-runonly-connector-0} flushing 0 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
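At this point the worker is only committing offsets with no outstanding messages, which is the expected steady state when no new change events are being captured. To confirm the connector stays healthy after this startup sequence, the Connect REST API and the broker tooling can be queried directly; a minimal sketch, assuming the advertised REST address 172.18.0.6:8083 and the bootstrap server kafka:29093 shown in the log:

  # Connector and task state should both report RUNNING
  curl -s http://172.18.0.6:8083/connectors/sysint-sqlserver-tec-runonly-connector/status

  # With database.server.name=tec and the RegexRouter configured above, change events for
  # dbo.tab1 and dbo.tab2 are expected on the rerouted topics, e.g.
  #   it.company.sysint.data.cdc.tables.tec.dbo.tab1
  #   it.company.sysint.data.cdc.tables.tec.dbo.tab2
  /kafka/bin/kafka-topics.sh --bootstrap-server kafka:29093 --list | grep it.company.sysint.data.cdc.tables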