Bug
Resolution: Obsolete
Major
Critical
What Debezium connector do you use and what version?
Debezium Informix Connector
Working version: 3.0.5
Failing version: 3.3.2
What is the connector configuration?
{
  "connector.class": "io.debezium.connector.informix.InformixConnector",
  "errors.log.include.messages": "true",
  "topic.creation.default.partitions": "1",
  "value.converter.schema.registry.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicNameStrategy",
  "key.converter.schema.registry.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicNameStrategy",
  "transforms": "unwrap",
  "topic.creation.default.retention.bytes": "-1",
  "errors.deadletterqueue.context.headers.enable": "true",
  "transforms.unwrap.drop.tombstones": "false",
  "topic.creation.default.replication.factor": "3",
  "errors.deadletterqueue.topic.replication.factor": "3",
  "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
  "errors.log.enable": "true",
  "key.converter": "io.confluent.connect.avro.AvroConverter",
  "database.user": "***",
  "database.dbname": "****",
  "topic.creation.default.compression.type": "lz4",
  "topic.creation.default.segment.bytes": "1073741824",
  "column.exclude.list": "db.owner.table.column_name",
  "topic.creation.default.segment.ms": "3600000",
  "schema.history.internal.kafka.bootstrap.servers": "***",
  "value.converter.schema.registry.url": "http://***",
  "errors.max.retries": "2",
  "errors.deadletterqueue.topic.name": "informix-gpdb-source-errors",
  "database.password": "******",
  "errors.tolerance": "none",
  "skipped.operations": "d",
  "pk.mode": "kafka",
  "snapshot.mode": "no-data",
  "max.queue.size": "100000",
  "tasks.max": "1",
  "retriable.restart.connector.wait.ms": "60000",
  "database.connection.retry.interval.ms": "1000",
  "schema.history.internal.store.only.captured.tables.ddl": "true",
  "schema.history.internal.store.only.captured.databases.ddl": "true",
  "tombstones.on.delete": "true",
  "topic.prefix": "inst0240_net_07",
  "decimal.handling.mode": "double",
  "schema.history.internal.kafka.topic": "****",
  "transforms.unwrap.add.headers.prefix": "",
  "connection.pool.max.size": "50",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "topic.creation.default.cleanup.policy": "delete",
  "time.precision.mode": "connect",
  "database.server.name": "inst0240_net_06",
  "snapshot.isolation.mode": "read_committed",
  "topic.creation.default.retention.ms": "864000000",
  "transforms.unwrap.add.headers": "db,op,table,lsn,source.ts_ms",
  "database.port": "2901",
  "schema.history.internal.kafka.recovery.poll.interval.ms": "120000",
  "database.hostname": "***.***.***.247",
  "database.connection.retries": "5",
  "table.include.list": "db.owner.table",
  "key.converter.schema.registry.url": "http://****"
}
What is the captured database version and mode of deployment?
Informix database on-premises
Debezium Informix connector deployed on Kafka Connect (Confluent Platform).
What behavior do you expect?
- When column.exclude.list lists TEXT, BYTE, and BLOB columns, the connector should skip these columns entirely, exactly as column.skip.list did in Debezium 3.0.5.
- Schema and payload should remain consistent.
- The connector should start normally and stream CDC events for the remaining columns.
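The expected exclusion semantics can be sketched as follows (illustrative Python only; `apply_exclude` is a hypothetical helper, not Debezium internals):

```python
# Hypothetical sketch of the expected behavior: an excluded column is
# simply dropped from the emitted record value, instead of the capture
# request failing. Field names mirror the repro table below.

def apply_exclude(row: dict, excluded: set) -> dict:
    """Return the record payload with the excluded columns removed."""
    return {col: val for col, val in row.items() if col not in excluded}

captured_row = {"id": 1, "name": "abc", "description": "large TEXT value"}
emitted = apply_exclude(captured_row, {"description"})
print(emitted)  # {'id': 1, 'name': 'abc'}
```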
What behavior do you see?
- In Debezium 3.3.2, column.exclude.list does not exclude TEXT, BYTE, and BLOB columns.
- Debezium still includes these columns in the CDC capture request.
- The connector fails during CDC capture with error code -83717.
- The same tables work correctly on Debezium 3.0.5 with column.skip.list.
Do you see the same behaviour using the latest released Debezium version?
Yes, reproduced on 3.3.2 (latest stable).
Do you have the connector logs, ideally from start till finish?
(You might be asked later to provide DEBUG/TRACE level log)
Yes, full logs can be attached, including DEBUG/TRACE on request.
How to reproduce the issue using our tutorial deployment?
1- CREATE TABLE "owner".sample_table (
       id SERIAL PRIMARY KEY,
       name VARCHAR(50),
       description TEXT
   );
2- Configure connector including property:
"column.exclude.list": "mydb.owner.sample_table.description"
3- Start the connector.
4- The connector fails with error code -83717.
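For completeness, the repro connector can be registered through the standard Kafka Connect REST endpoint (`PUT /connectors/{name}/config`). The worker URL, connector name, and connection placeholders below are assumptions; only the two list properties are the ones relevant to the bug:

```python
import json
import urllib.request

# Hypothetical Connect worker URL and connector name; adjust to your deployment.
CONNECT_URL = "http://localhost:8083"
CONNECTOR_NAME = "informix-exclude-repro"

# Minimal repro configuration: only the properties relevant to the bug,
# plus placeholder connection settings.
config = {
    "connector.class": "io.debezium.connector.informix.InformixConnector",
    "database.hostname": "informix-host",   # placeholder
    "database.port": "2901",
    "database.user": "user",                # placeholder
    "database.password": "password",        # placeholder
    "database.dbname": "mydb",
    "topic.prefix": "repro",
    "table.include.list": "mydb.owner.sample_table",
    "column.exclude.list": "mydb.owner.sample_table.description",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-history.repro",
}

def register(name: str, cfg: dict) -> None:
    """PUT the connector config to the Connect REST API (creates or updates)."""
    req = urllib.request.Request(
        f"{CONNECT_URL}/connectors/{name}/config",
        data=json.dumps(cfg).encode(),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(req)

# register(CONNECTOR_NAME, config)  # uncomment against a live worker
```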
Feature request or enhancement
Which use case/requirement will be addressed by the proposed feature?
Users need to capture CDC for tables that contain unsupported or large columns (TEXT/BLOB/BYTE) by excluding them.
This functionality existed in Debezium 3.0.5 via column.skip.list and is needed for upgrades.
Without it, CDC is impossible for many Informix tables.
Implementation ideas (optional)
- Ensure column.exclude.list removes the specified columns from:
  - schema discovery
  - emitted record structures
  - snapshot queries
  - CDC events
- Match the functionality of the previous column.skip.list.
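As a sketch of the intended matching: Debezium documents `column.exclude.list` as a comma-separated set of regular expressions anchored against fully-qualified column names. The helper below is illustrative only, not the actual connector code, and assumes case-sensitive matching:

```python
import re

def build_exclude_filter(exclude_list: str):
    """Compile column.exclude.list into a predicate over fully-qualified
    column names (database.schema.table.column); each entry is a regex
    that must match the whole name."""
    patterns = [re.compile(p.strip())
                for p in exclude_list.split(",") if p.strip()]
    def included(fqn: str) -> bool:
        return not any(p.fullmatch(fqn) for p in patterns)
    return included

included = build_exclude_filter("mydb.owner.sample_table.description")
columns = ["mydb.owner.sample_table.id",
           "mydb.owner.sample_table.name",
           "mydb.owner.sample_table.description"]
# The excluded column should be dropped before the CDC capture request
# is built, so the TEXT column never reaches the Informix CDC API.
captured = [c for c in columns if included(c)]
print(captured)  # ['mydb.owner.sample_table.id', 'mydb.owner.sample_table.name']
```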