Debezium / DBZ-259

[Postgres] Interval column causes exception during handling of DELETE


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version/s: 0.6.2
    • Affects Version/s: 0.5
    • Component/s: postgresql-connector
    • Labels: None

      1. run zookeeper: docker run -it --rm --name zookeeper -p 2181:2181 -p 2888:2888 -p 3888:3888 debezium/zookeeper:0.5
      2. run kafka: docker run -it --rm --name kafka -p 9092:9092 --link zookeeper:zookeeper debezium/kafka:0.5
      3. run pg 9.6: docker run -it --rm --name debpostgres -p 5435:5432 debezium/postgres:9.6
      4. create a database called "circle_test" with one table and add 3 rows:

      CREATE TABLE issues (
          id bigserial primary key,
          title varchar(512) not null,
          time_limit interval default '60 days'::interval not null
      );
      
      insert into issues(title) values ('Foo');
      insert into issues(title) values ('Bar');
      insert into issues(title) values ('Baz');
      

      5. run kafka connect: docker run -it --rm --name connect -p 8083:8083 -e GROUP_ID=1 -e CONFIG_STORAGE_TOPIC=my_connect_configs -e OFFSET_STORAGE_TOPIC=my_connect_offsets --link zookeeper:zookeeper --link kafka:kafka debezium/connect:0.5
      6. create a new connector:

      curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{ "name": "circle-connector", "config": { "connector.class": "io.debezium.connector.postgresql.PostgresConnector", "database.hostname": "172.17.0.1", "database.port": "5435", "database.user": "postgres", "database.password": "ignoredanyway", "database.dbname": "circle_test", "database.server.name": "circle"} }'
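Before triggering any changes, it can help to confirm the connector registered successfully. This status check is not part of the original report; it assumes the connect container from step 5 is reachable on localhost:8083:

```shell
# Not from the original report: query Kafka Connect's REST API for the
# connector's state; the connector and its task should report RUNNING.
curl -s localhost:8083/connectors/circle-connector/status
```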
      

      7. Updating rows in the issues table yields the expected results from Debezium's perspective. As soon as a delete operation occurs, debezium/connect throws an exception:
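For example, a statement like the following triggers the failure (an assumed statement for illustration; the row [3, null, null] in the log suggests the row with id 3 was deleted):

```sql
-- assumed repro statement, not from the original report
DELETE FROM issues WHERE id = 3;
```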

      2017-05-18 11:35:00,550 ERROR  Postgres|circle|records-stream-producer  Failed to properly convert data value for 'public.issues.time_limit' of type interval for row [3, null, null]:   [io.debezium.relational.TableSchemaBuilder]
      org.apache.kafka.connect.errors.DataException: Invalid Java object for schema type FLOAT64: class java.time.Duration
      	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:233)
      	at org.apache.kafka.connect.data.Struct.put(Struct.java:215)
      	at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$3(TableSchemaBuilder.java:231)
      	at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:111)
      	at io.debezium.connector.postgresql.RecordsStreamProducer.generateDeleteRecord(RecordsStreamProducer.java:308)
      	at io.debezium.connector.postgresql.RecordsStreamProducer.process(RecordsStreamProducer.java:203)
      	at io.debezium.connector.postgresql.RecordsStreamProducer.streamChanges(RecordsStreamProducer.java:106)
      	at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$start$1(RecordsStreamProducer.java:91)
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      	at java.lang.Thread.run(Thread.java:745)
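The exception shows the mismatch directly: the Connect schema for time_limit is FLOAT64, but the connector passes the raw java.time.Duration decoded from the interval into Struct.put. A minimal sketch of the kind of conversion that is needed (my assumption of the shape of a fix, not Debezium's actual code; a real converter also has to approximate variable-length interval units such as months):

```java
import java.time.Duration;

// Sketch only: reduce a java.time.Duration to a microsecond count so the
// value satisfies a FLOAT64 Connect schema. Passing the Duration object
// itself is what raises the DataException seen in the trace above.
public class IntervalMicros {
    static double toMicros(Duration d) {
        return d.getSeconds() * 1_000_000.0 + d.getNano() / 1_000.0;
    }

    public static void main(String[] args) {
        // the table's default value: interval '60 days'
        System.out.println(toMicros(Duration.ofDays(60))); // 5.184E12
    }
}
```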
      

            Assignee: jpechane Jiri Pechanec
            Reporter: gunnar.morling Gunnar Morling
            Votes: 0
            Watchers: 2
