Type: Enhancement
Resolution: Done
Priority: Major
To make your issue report as actionable as possible, please provide the following information, depending on the issue type.
Bug report
For bug reports, provide this information, please:
This bug was discussed here:
What Debezium connector do you use and what version?
Postgres - Debezium server 2.6.2.Final and 3.0.1
What is the connector configuration?
<Your answer>
What is the captured database version and mode of deployment?
(E.g. on-premises, with a specific cloud provider, etc.)
PostgreSQL 15 Google Cloud SQL
What behavior do you expect?
UPDATE statements should be processed normally.
What behavior do you see?
UPDATE events on the affected table fail with a DataException (see the error message and logs below).
Do you see the same behaviour using the latest released Debezium version?
(Ideally, also verify with latest Alpha/Beta/CR version)
Tested with version 3.0.1, same problem
Do you have the connector logs, ideally from start till finish?
(You might be asked later to provide DEBUG/TRACE level log)
```json
{
  "content": {
    "timestamp": "2024-11-14T09:14:43.229Z",
    "service": "debezium-server",
    "message": "Error while processing event at offset {lsn_proc=2860206157408, messageType=UPDATE, lsn_commit=2860206142200, lsn=2860206157408, txId=314093763, ts_usec=1731575683037157}",
    "attributes": {
      "exception": {
        "exceptionType": "org.apache.kafka.connect.errors.DataException",
        "frames": [
          { "method": "validateValue", "line": 220, "class": "org.apache.kafka.connect.data.ConnectSchema" },
          { "method": "validate", "line": 233, "class": "org.apache.kafka.connect.data.Struct" },
          { "method": "validateValue", "line": 250, "class": "org.apache.kafka.connect.data.ConnectSchema" },
          { "method": "put", "line": 216, "class": "org.apache.kafka.connect.data.Struct" },
          { "method": "put", "line": 203, "class": "org.apache.kafka.connect.data.Struct" },
          { "method": "update", "line": 293, "class": "io.debezium.data.Envelope" },
          { "method": "emitUpdateRecord", "line": 123, "class": "io.debezium.relational.RelationalChangeRecordEmitter" },
          { "method": "emitChangeRecords", "line": 53, "class": "io.debezium.relational.RelationalChangeRecordEmitter" },
          { "method": "emitChangeRecords", "line": 94, "class": "io.debezium.connector.postgresql.PostgresChangeRecordEmitter" },
          { "method": "dispatchDataChangeEvent", "line": 271, "class": "io.debezium.pipeline.EventDispatcher" },
          { "method": "processReplicationMessages", "line": 315, "class": "io.debezium.connector.postgresql.PostgresStreamingChangeEventSource" },
          { "method": "lambda$processMessages$0", "line": 217, "class": "io.debezium.connector.postgresql.PostgresStreamingChangeEventSource" },
          { "method": "decodeUpdate", "line": 479, "class": "io.debezium.connector.postgresql.connection.pgoutput.PgOutputMessageDecoder" },
          { "method": "processNotEmptyMessage", "line": 211, "class": "io.debezium.connector.postgresql.connection.pgoutput.PgOutputMessageDecoder" },
          { "method": "processMessage", "line": 41, "class": "io.debezium.connector.postgresql.connection.AbstractMessageDecoder" },
          { "method": "deserializeMessages", "line": 642, "class": "io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1" },
          { "method": "readPending", "line": 634, "class": "io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1" },
          { "method": "processMessages", "line": 217, "class": "io.debezium.connector.postgresql.PostgresStreamingChangeEventSource" },
          { "method": "execute", "line": 179, "class": "io.debezium.connector.postgresql.PostgresStreamingChangeEventSource" },
          { "method": "execute", "line": 42, "class": "io.debezium.connector.postgresql.PostgresStreamingChangeEventSource" },
          { "method": "streamEvents", "line": 280, "class": "io.debezium.pipeline.ChangeEventSourceCoordinator" },
          { "method": "executeChangeEventSources", "line": 197, "class": "io.debezium.pipeline.ChangeEventSourceCoordinator" },
          { "method": "lambda$start$0", "line": 140, "class": "io.debezium.pipeline.ChangeEventSourceCoordinator" },
          { "method": "call", "line": 515, "class": "java.util.concurrent.Executors$RunnableAdapter" },
          { "method": "run", "line": 264, "class": "java.util.concurrent.FutureTask" },
          { "method": "runWorker", "line": 1128, "class": "java.util.concurrent.ThreadPoolExecutor" },
          { "method": "run", "line": 628, "class": "java.util.concurrent.ThreadPoolExecutor$Worker" },
          { "method": "run", "line": 829, "class": "java.lang.Thread" }
        ],
        "refId": 1,
        "message": "Invalid value: null used for required field: \"ship_to_countries\", schema type: ARRAY"
      }
    }
  }
}
```
How to reproduce the issue using our tutorial deployment?
"Failed to properly convert data value for 'my_db.my_schema.my_table' of type _bpchar"
Run UPDATE transactions against a table that holds TOASTed values (column storage set to EXTENDED) in columns of the BPCHAR[] data type.
In the PostgreSQL WAL, the transaction in question includes:
"my_table[character[]]: unchanged-toast-datum"
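The reproduction steps above can be sketched in SQL. This is an illustrative sketch only: the table layout, column names, and value sizes are assumptions (the original report does not include its schema); the key points are a `character(n)[]` column stored out of line via TOAST and an UPDATE that does not touch that column, so pgoutput sends "unchanged-toast-datum" instead of the array value.

```sql
-- Hypothetical schema; names are illustrative, not from the original report.
CREATE TABLE my_schema.my_table (
    id                bigint PRIMARY KEY,
    ship_to_countries character(2)[],   -- internal type name: _bpchar
    note              text
);

-- Make sure the array column can be stored out of line (TOAST).
ALTER TABLE my_schema.my_table
    ALTER COLUMN ship_to_countries SET STORAGE EXTENDED;

-- Insert a row whose array is large enough to be TOASTed: thousands of
-- pseudo-random two-character elements resist compression, so the value
-- exceeds the inline-storage threshold (~2 kB) and is moved to TOAST.
INSERT INTO my_schema.my_table (id, ship_to_countries, note)
SELECT 1,
       array_agg(substr(md5(g::text), 1, 2)::character(2)),
       'initial'
FROM generate_series(1, 5000) AS g;

-- Update a column OTHER than the TOASTed array. Because the array is
-- unchanged, the logical decoding output carries "unchanged-toast-datum"
-- for it rather than the value, which the connector must then handle.
UPDATE my_schema.my_table SET note = 'updated' WHERE id = 1;
```

With the replica identity at its default, the new tuple in the WAL omits the unchanged TOASTed array, matching the "unchanged-toast-datum" line quoted above.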
Feature request or enhancement
For feature requests or enhancements, provide this information, please:
Which use case/requirement will be addressed by the proposed feature?
<Your answer>
Implementation ideas (optional)
<Your answer>