Debezium / DBZ-494

Fix support for date arrays


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version/s: 0.7.2
    • Affects Version/s: None
    • Component/s: postgresql-connector
    • Labels: None

      For a column of type DATE[], the schema is derived correctly in PostgresValueConverter.schemaBuilder(); in particular, the "time precision" mode is taken into account, resulting in a specific schema type.

      When the value is obtained through convertArray(), however, it is not converted accordingly: date values are passed on as java.util.Date, which is not a supported value when using the "adaptive" precision mode:

      connect_1    | 2017-11-29 10:38:24,488 ERROR  Postgres|dbserver1|records-stream-producer  Failed to properly convert data value for 'inventory.customers.reg_dates' of type _date for row [1002, George, Bailey, gbailey@foobar.com, [1, 2, 3], [2014-07-07, 2014-07-08]]:   [io.debezium.relational.TableSchemaBuilder]
      connect_1    | org.apache.kafka.connect.errors.DataException: Invalid Java object for schema type INT32: class java.sql.Date for field: "null"
      connect_1    | 	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:239)
      connect_1    | 	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:209)
      connect_1    | 	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:253)
      connect_1    | 	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:209)
      connect_1    | 	at org.apache.kafka.connect.data.Struct.put(Struct.java:214)
      connect_1    | 	at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$3(TableSchemaBuilder.java:230)
      connect_1    | 	at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:111)
      connect_1    | 	at io.debezium.connector.postgresql.RecordsStreamProducer.generateUpdateRecord(RecordsStreamProducer.java:298)
      connect_1    | 	at io.debezium.connector.postgresql.RecordsStreamProducer.process(RecordsStreamProducer.java:238)
      connect_1    | 	at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$streamChanges$2(RecordsStreamProducer.java:111)
      connect_1    | 	at io.debezium.connector.postgresql.connection.wal2json.Wal2JsonMessageDecoder.processMessage(Wal2JsonMessageDecoder.java:53)
      connect_1    | 	at io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1.deserializeMessages(PostgresReplicationConnection.java:196)
      connect_1    | 	at io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1.read(PostgresReplicationConnection.java:181)
      connect_1    | 	at io.debezium.connector.postgresql.RecordsStreamProducer.streamChanges(RecordsStreamProducer.java:111)
      connect_1    | 	at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$start$1(RecordsStreamProducer.java:97)
      connect_1    | 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      connect_1    | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      connect_1    | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      connect_1    | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      connect_1    | 	at java.lang.Thread.run(Thread.java:748)
      
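The failing validation above boils down to a type mismatch: in "adaptive" precision mode the element schema for a DATE column is INT32, representing the number of days since the Unix epoch, while convertArray() hands over raw java.sql.Date objects. A minimal sketch of the expected element representation (class and method names are illustrative, not Debezium code):

```java
import java.sql.Date;
import java.time.LocalDate;

public class AdaptiveDateRepr {

    // In "adaptive" precision mode, DATE values are expected as an INT32
    // holding the number of days since the epoch (1970-01-01), not as a
    // java.sql.Date object.
    static int toEpochDays(Date sqlDate) {
        LocalDate local = sqlDate.toLocalDate();
        return (int) local.toEpochDay();
    }

    public static void main(String[] args) {
        // One of the array elements from the failing row above.
        Date d = Date.valueOf("2014-07-07");
        // A raw java.sql.Date never validates against an INT32 schema;
        // each element must be converted to its epoch-day count first.
        System.out.println(toEpochDays(d)); // 16258
    }
}
```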

      The correct fix is to feed the array element values through the matching element converter in convertArray(). This is not a problem for most other supported array types, as no value conversion is required for them (e.g. int).
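The proposed fix can be sketched as a per-element delegation: rather than copying array elements verbatim, convertArray() routes each element through the converter that matches the element schema. The helper below is a hypothetical simplification, not the actual Debezium method signature:

```java
import java.sql.Date;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class ArrayElementConversion {

    // Hypothetical sketch: convert each array element with the converter
    // belonging to the element schema, instead of passing values through
    // unchanged.
    static List<Object> convertArray(Object[] elements,
                                     Function<Object, Object> elementConverter) {
        List<Object> converted = new ArrayList<>(elements.length);
        for (Object element : elements) {
            converted.add(elementConverter.apply(element));
        }
        return converted;
    }

    public static void main(String[] args) {
        // Element converter for "adaptive" DATE: java.sql.Date -> epoch days.
        Function<Object, Object> dateToEpochDays =
                v -> (int) ((Date) v).toLocalDate().toEpochDay();

        // The reg_dates values from the failing row above.
        Date[] regDates = { Date.valueOf("2014-07-07"), Date.valueOf("2014-07-08") };

        // Each element now satisfies the INT32 element schema.
        System.out.println(convertArray(regDates, dateToEpochDays)); // [16258, 16259]
    }
}
```

With this shape, integer-like array types simply get an identity converter, which is why they never triggered the bug.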

              Assignee: Jiri Pechanec (jpechane)
              Reporter: Gunnar Morling (gunnar.morling)
              Votes: 0
              Watchers: 2
