Type: Bug
Priority: Major
Resolution: Not a Bug
Affects Versions: 2.4.1.Final, 2.5.2.Final
What Debezium connector do you use and what version?
Debezium SQL Server connector, version 2.5.2; I observed the same behaviour with version 2.4.1.
What is the connector configuration?
connector.class=io.debezium.connector.sqlserver.SqlServerConnector
errors.log.include.messages=true
tasks.max=3
schema.history.internal.consumer.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
transforms=DB1,DB2,DB3
schema.include.list=dbo,IFC
transforms.DB2.topic.regex=project\.DB2\.dbo\..*
schema.history.internal.store.only.captured.tables.ddl=true
transforms.DB3.topic.regex=project\.DB3\.IFC\..*
topic.prefix=project
schema.history.internal.kafka.topic=project.dbo.schema_changes
schema.history.internal.producer.security.protocol=SASL_SSL
database.applicationIntent=ReadWrite
poll.interval.ms=20000
transforms.DB1.topic.regex=project\.DB1\.dbo\..*
errors.log.enable=true
schema.history.internal.producer.sasl.mechanism=AWS_MSK_IAM
transforms.DB3.topic.replacement=project\.DB3\.IFC\.alltables
database.encrypt=true
transforms.DB1.type=io.debezium.transforms.ByLogicalTableRouter
schema.history.internal.consumer.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
database.user=user
transforms.DB2.topic.replacement=project\.DB2\.dbo\.alltables
database.names=DB1,DB2,DB3
schema.history.internal.producer.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
schema.history.internal.kafka.bootstrap.servers=XXXX,YYYY,ZZZZ
snapshot.isolation.mode=read_committed
database.port=XXXX
transforms.DB1.topic.replacement=project\.DB1\.dbo\.alltables
transforms.DB2.type=io.debezium.transforms.ByLogicalTableRouter
transforms.DB3.type=io.debezium.transforms.ByLogicalTableRouter
database.hostname=XXXX
database.password=password
schema.history.internal.consumer.sasl.mechanism=AWS_MSK_IAM
schema.history.internal.producer.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
database.trustServerCertificate=true
table.include.list=dbo.aLotOfTables,IFC.aLotofTables
schema.history.internal.consumer.security.protocol=SASL_SSL
Worker configuration:
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
config.providers.secretManager.class=com.github.jcustenborder.kafka.config.aws.SecretsManagerConfigProvider
config.providers=secretManager
config.providers.secretManager.param.aws.region=AWS region
offset.storage.topic=__amazon_msk_connect_offsets_project_topic
producer.batch.size=32768
producer.max.request.size=4194304
What is the captured database version and mode of deployment?
Microsoft SQL Server 2016 Standard (64-bit)
13.0.6435.1
Apache Kafka Connect version
2.7.1
What behaviour do you expect?
I expect Debezium to capture all columns of each table.
What behaviour do you see?
Debezium ignores one column of type NVARCHAR(4000) in one of the tables.
{"before":null, "after":
.
And there should be another "TEXT" column, of type NVARCHAR(4000). Meanwhile, Debezium does not detect it at all.
We have valid data for this column in the CDC change table (a verification query sketch follows the link below).
Here is a link to the Zulip thread with additional information:
https://debezium.zulipchat.com/#narrow/stream/348252-community-sqlserver/topic/Empty.20data.20type.20-.20Debezium.20ignores.20column
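As a rough illustration of how the change-table data can be checked directly (the capture instance name dbo_MyTable is a placeholder, not the real one):

-- Read all changes recorded for the capture instance, including the NVARCHAR(4000) column
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_MyTable');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
SELECT __$start_lsn, __$operation, [TEXT]
FROM cdc.fn_cdc_get_all_changes_dbo_MyTable(@from_lsn, @to_lsn, N'all');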
Do you see the same behaviour using the latest released Debezium version?
I first tried version 2.4.1 and then version 2.5.2, which was the latest release at the time. I have not yet tried version 2.6.
Do you have the connector logs, ideally from start till finish?
The logs did not indicate any error. The only odd log I noticed was:
"[Worker-0455ba77efbdf90ac] [2024-03-06 05:33:02,168] WARN [<connector-name>|task-0] Column 'TEXT' available in capture table not found among source table columns (io.debezium.connector.sqlserver.SqlServerChangeTablePointer:147)"
The connector had been started two days earlier; this message appeared only then.
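For context, that warning is raised when a column of the CDC capture table is not found among the source table columns known to the connector. A rough way to check for such a mismatch on the database side (capture instance and table names below are placeholders) is:

-- Columns recorded for the capture instance (placeholder name dbo_MyTable)
SELECT cc.column_name
FROM cdc.change_tables ct
JOIN cdc.captured_columns cc ON cc.[object_id] = ct.[object_id]
WHERE ct.capture_instance = 'dbo_MyTable';

-- Columns of the current source table (placeholder name dbo.MyTable)
SELECT c.name
FROM sys.columns c
WHERE c.[object_id] = OBJECT_ID('dbo.MyTable');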
How to reproduce the issue using our tutorial deployment?
I guess you could create a SQL Server database with even a single table where one column has the NVARCHAR(4000) type, enable CDC for it, and try to capture it with the connector. A minimal sketch follows below.
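A minimal repro sketch along those lines (table and column names are made up; SQL Server Agent must be running for CDC to populate the change table):

-- Enable CDC on the database, then create a table with an NVARCHAR(4000) column
EXEC sys.sp_cdc_enable_db;

CREATE TABLE dbo.repro_table (
    id     INT IDENTITY PRIMARY KEY,
    [TEXT] NVARCHAR(4000) NULL
);

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'repro_table',
    @role_name     = NULL;

-- Insert a row so the connector has a change to capture
INSERT INTO dbo.repro_table ([TEXT]) VALUES (N'sample value');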