The connector throws `Invalid value: null used for required field: "longitude", schema type: FLOAT64` after we add a `NOT NULL` column to the source table in Postgres.
We think the cause may be:
When the connector restarts, it fetches the newest schema from the database, but its WAL read position (the LSN) points to an earlier spot, so it replays records that do not match the latest schema, e.g. records that lack the new `NOT NULL` column.
We worked around this in our environment by patching the Debezium code: no matter whether a column is nullable, the corresponding field is always made optional.
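A minimal sketch of the idea behind our patch. The class and method names below (`ColumnSpec`, `buildFieldSchema*`) are illustrative stand-ins, not Debezium's actual internal API; the point is only that the patched path ignores column nullability when building the field schema:

```java
public class OptionalFieldWorkaround {

    // Minimal stand-in for a relational column definition.
    record ColumnSpec(String name, String type, boolean nullable) {}

    // Minimal stand-in for a Kafka Connect field schema.
    record FieldSchema(String name, String type, boolean optional) {}

    // Original behavior: the field is optional only when the column is nullable,
    // so an old record missing a NOT NULL column fails with
    // "Invalid value: null used for required field".
    static FieldSchema buildFieldSchemaOriginal(ColumnSpec col) {
        return new FieldSchema(col.name(), col.type(), col.nullable());
    }

    // Patched behavior: always optional, so records captured before the
    // ALTER TABLE (which lack the new NOT NULL column) still fit the schema.
    static FieldSchema buildFieldSchemaPatched(ColumnSpec col) {
        return new FieldSchema(col.name(), col.type(), true);
    }

    public static void main(String[] args) {
        ColumnSpec longitude = new ColumnSpec("longitude", "FLOAT64", false);
        System.out.println(buildFieldSchemaOriginal(longitude).optional()); // false
        System.out.println(buildFieldSchemaPatched(longitude).optional());  // true
    }
}
```

The trade-off of this workaround is that downstream consumers can no longer rely on the schema to enforce non-null values.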
Some additional observations that may help:
1. Debezium might lose the column's `default` information.
2. Adding a new `NOT NULL` field breaks schema compatibility, and when working with Kafka's HDFS sink connector with Hive integration enabled, the sink connector will fail.