Debezium / DBZ-6505

Can't start Kafka Connector after upgrade Debezium MS SQL plugin from 1.9.7 to 2.2.1


Details

    • Type: Bug
    • Resolution: Unresolved
    • Priority: Blocker
    • Labels: under-triaging
    Description

      Bug report

      What Debezium connector do you use and what version?

      Microsoft SQL Server connector, version 2.2.1

      What is the connector configuration?

      {
          "name": "test.ssl.dc-cdata-t.sql.internal.EGDataQA.debezium",
          "config": {
              "TimeStampConverters.type": "com.cfins.ds.kafka.converters.debezium.TimestampConverter",
              "binary.handling.mode": "base64",
              "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
              "consumer.sasl.client.callback.handler.class": "software.amazon.msk.auth.iam.IAMClientCallbackHandler",
              "consumer.sasl.jaas.config": "software.amazon.msk.auth.iam.IAMLoginModule required;",
              "consumer.sasl.mechanism": "AWS_MSK_IAM",
              "consumer.security.protocol": "SASL_SSL",
              "converters": "TimeStampConverters",
              "database.encrypt": "false",
              "database.instance": "test.ssl.dc-cdata-t.sql.internal.EGDataQA.io",
              "schema.history.internal.consumer.sasl.client.callback.handler.class": "software.amazon.msk.auth.iam.IAMClientCallbackHandler",
              "schema.history.internal.consumer.sasl.jaas.config": "software.amazon.msk.auth.iam.IAMLoginModule required;",
              "schema.history.internal.consumer.sasl.mechanism": "AWS_MSK_IAM",
              "schema.history.internal.producer.sasl.client.callback.handler.class": "software.amazon.msk.auth.iam.IAMClientCallbackHandler",
              "schema.history.internal.producer.sasl.jaas.config": "software.amazon.msk.auth.iam.IAMLoginModule required;",
              "schema.history.internal.producer.sasl.mechanism": "AWS_MSK_IAM",
              "database.hostname": "${file:/etc/kafka-utils/secrets/database_credentials/App_nonprod_credentials.properties:hostname}",
              "database.names": "${file:/etc/kafka-utils/secrets/database_credentials/App_nonprod_credentials.properties:dbname}",
              "database.password": "${file:/etc/kafka-utils/secrets/database_credentials/App_nonprod_credentials.properties:password}",
              "database.port": "${file:/etc/kafka-utils/secrets/database_credentials/App_nonprod_credentials.properties:port}",
              "database.trustServerCertificate": "false",
              "database.user": "${file:/etc/kafka-utils/secrets/database_credentials/App_nonprod_credentials.properties:username}",
              "decimal.handling.mode": "double",
              "heartbeat.interval.ms": "60000",
              "include.schema.changes": "true",
              "key.converter": "io.confluent.connect.avro.AvroConverter",
              "key.converter.basic.auth.credentials.source": "USER_INFO",
              "key.converter.schema.registry.basic.auth.user.info": "${file:/etc/kafka-utils/secrets/container_secrets/kafka_msk_credentials.properties:schemaRegistryBasicAuth}",
              "key.converter.schema.registry.url": "${file:/etc/kafka-utils/secrets/container_secrets/kafka_msk_credentials.properties:schemaRegistryUrl}",
              "producer.compression.type": "snappy",
              "producer.override.max.request.size": "52428800",
              "producer.reconnect.backoff.max.ms": "5000",
              "producer.reconnect.backoff.ms": "1000",
              "producer.sasl.client.callback.handler.class": "software.amazon.msk.auth.iam.IAMClientCallbackHandler",
              "producer.sasl.jaas.config": "software.amazon.msk.auth.iam.IAMLoginModule required;",
              "producer.sasl.mechanism": "AWS_MSK_IAM",
              "producer.security.protocol": "SASL_SSL",
              "schema.history.internal.kafka.bootstrap.servers": "${file:/etc/kafka-utils/secrets/container_secrets/kafka_msk_credentials.properties:bootstrap}",
              "schema.history.internal.kafka.topic": "test.ssl.dc-cdata-t.sql.internal.EGDataQA.schema-changes",
              "schema.history.internal.kafka.recovery.poll.interval.ms": 100,
              "schema.history.internal.producer.security.protocol": "SASL_SSL",
              "schema.history.internal.consumer.security.protocol": "SASL_SSL",
              "table.include.list": "dbo.History,dbo.Quote",
              "timestampConverter.convert.mode": "generic",
              "timestampConverter.debug": "true",
              "timestampConverter.format.date": "YYYY-MM-dd",
              "timestampConverter.format.datetime": "YYYY-MM-dd'T'HH:mm:ss.SSS",
              "timestampConverter.format.datetimetz": "YYYY-MM-dd'T'HH:mm:ss.SSS'Z'",
              "timestampConverter.format.time": "HH:mm:ss.SSS",
              "timestampConverter.format.timezone": "UTC",
              "tombstones.on.delete": "true",
              "topic.creation.default.max.message.bytes": "52428800",
              "topic.creation.default.partitions": "5",
              "topic.creation.default.replication.factor": "2",
              "topic.creation.default.retention.ms": 5259600000,
              "topic.prefix": "test.ssl.dc-cdata-t.sql.internal.EGDataQA",
              "value.converter": "io.confluent.connect.avro.AvroConverter",
              "value.converter.basic.auth.credentials.source": "USER_INFO",
              "value.converter.schema.registry.basic.auth.user.info": "${file:/etc/kafka-utils/secrets/container_secrets/kafka_msk_credentials.properties:schemaRegistryBasicAuth}",
              "value.converter.schema.registry.url": "${file:/etc/kafka-utils/secrets/container_secrets/kafka_msk_credentials.properties:schemaRegistryUrl}",
              "poll.interval.ms": 100,
              "snapshot.mode": "initial",
              "time.precision.mode": "adaptive",
              "schema.include.list": "dbo",
              "schema.history.internal.kafka.query.timeout.ms": 3000,
              "schema.history.internal.kafka.create.timeout.ms": 30000,
              "schema.history.internal.kafka.recovery.attempts": 100
          }
      }
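
      For comparison, the Debezium 2.2.x documentation describes a minimal SQL Server connector configuration along the lines of the sketch below. This is illustrative only; every name and value is a placeholder, not taken from this report:

      {
          "name": "minimal-sqlserver-connector",
          "config": {
              "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
              "database.hostname": "sqlserver-host",
              "database.port": "1433",
              "database.user": "user",
              "database.password": "password",
              "database.names": "testDB",
              "topic.prefix": "server1",
              "table.include.list": "dbo.customers",
              "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
              "schema.history.internal.kafka.topic": "schemahistory.server1"
          }
      }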
      What is the captured database version and mode of deployment?

      (E.g. on-premises, with a specific cloud provider, etc.)

      AWS RDS

      What behaviour do you expect?

      The connector should be created successfully, since the configuration above already uses the new schema.history.internal.* properties that replaced database.history.* in Debezium 2.x; the renames are summarized below.
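
      For reference, the renames introduced in Debezium 2.0 that are relevant here, summarized from the Debezium 2.0 migration notes (not from this report):

      Debezium 1.9.x property                       Debezium 2.x replacement
      database.history.kafka.bootstrap.servers     schema.history.internal.kafka.bootstrap.servers
      database.history.kafka.topic                 schema.history.internal.kafka.topic
      database.history.producer.*                  schema.history.internal.producer.*
      database.history.consumer.*                  schema.history.internal.consumer.*
      database.server.name                         topic.prefix
      database.dbname                              database.names (SQL Server connector)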

      What behaviour do you see?

      The REST API returns this error message:

      {"error_code":400,"message":"Connector configuration is invalid and contains the following 3 error(s):\nA value is required\nA value is required\nA value is required\nYou can also find the above list of errors at the endpoint `/connector-plugins/{connectorType}/config/validate`"}
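
      The endpoint named in the error can be called directly to find out which fields the three "A value is required" errors refer to, since the validation response names every config key it checks. A sketch, assuming Connect listens on localhost:8083 (an assumption) and the flat config map (the contents of "config" above, without the "name" wrapper) is saved as connector-config.json:

      # Validate the config against the SQL Server connector plugin.
      # The JSON response lists each key with its error messages, so the
      # three failing fields are identified by name.
      curl -s -X PUT \
        -H "Content-Type: application/json" \
        --data @connector-config.json \
        http://localhost:8083/connector-plugins/io.debezium.connector.sqlserver.SqlServerConnector/config/validate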

      Do you see the same behaviour using the latest released Debezium version?

      (Ideally, also verify with latest Alpha/Beta/CR version)

      Didn't try

      Do you have the connector logs, ideally from start till finish?

      (You might be asked later to provide DEBUG/TRACE level log)

      <Your answer>

      How to reproduce the issue using our tutorial deployment?

      <Your answer>


People

    Assignee: Unassigned
    Reporter: Ramakrishna N