I'm exploring how the PG connector behaves in the case of concurrent schema changes and updates, and it seems the connector stops working in the following scenario:
- Set a breakpoint in PostgresSchema#refresh()
- Run this DDL: alter table customers add column middle_name varchar(255);
- Run this DML: update customers set first_name='Mary' where id = 1004; this will hit the breakpoint set above. Wait there.
- Run this DDL: alter table customers add column another_middle_name varchar(255);
- Continue after the breakpoint
No update for the DML statement arrives in the Kafka topic. I reckon the connector runs into some kind of error, though it doesn't log anything.
I suspect the sequence above poses an issue either way: the schema update triggered while processing the DML would read the second DDL, which isn't valid yet at that point.
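For illustration, here is a minimal sketch (not Debezium's actual refresh code) of why a refresh that queries the live catalog can race with DDL: a JDBC metadata query returns the table's columns as of now, so any DDL committed after the UPDATE being processed is already visible. The class name, the refreshColumns helper, and the connection URL/credentials are all hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class SchemaRefreshRace {

    // Hypothetical helper: re-reads a table's columns from the live catalog,
    // the way any schema refresh ultimately must. DDL committed between the
    // change event's position in the WAL and this query is already visible
    // here, so the refreshed schema can be "newer" than the row carried by
    // the event being processed.
    static List<String> refreshColumns(Connection conn, String table) throws Exception {
        List<String> columns = new ArrayList<>();
        try (ResultSet rs = conn.getMetaData().getColumns(null, "public", table, null)) {
            while (rs.next()) {
                columns.add(rs.getString("COLUMN_NAME"));
            }
        }
        return columns;
    }

    public static void main(String[] args) throws Exception {
        // Assumed local connection details; requires the PostgreSQL JDBC driver.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/postgres", "postgres", "postgres")) {
            // If 'another_middle_name' was added while the connector was paused
            // at the breakpoint, it shows up here even though the UPDATE being
            // processed carries one column fewer -- the mismatch described above.
            System.out.println(refreshColumns(conn, "customers"));
        }
    }
}
```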
- is blocked by
  - DBZ-453 Support new wal2json type specifiers (Closed)
  - DBZ-485 Protobuf message should contain type modifiers (Closed)
  - DBZ-486 Protobuf message should contain optional flags (Closed)
  - DBZ-487 Protobuf message should contain primary key information (Open)
- is duplicated by
  - DBZ-490 postgresql-connector does not detect column type changes (Closed)
- is related to
  - DBZ-454 Postgres wal2json drops unknown types during updates (Closed)
  - DBZ-512 For old connector OID should be used to detect schema change (Closed)