- Enhancement
- Resolution: Done
- Major
- None
- None
- False
- False
- Undefined
Hello! We are using Debezium to track changes to a single table in Postgres and have an extremely large database (~350 schemas, ~200 tables per schema, ~15 columns per table). When Debezium starts up, it seems to load the entire schema, which in one of our production-like environments can take up to half an hour of constant querying. Since we only want events from a single table, is there a way to limit how much of the schema Debezium needs to load? Or is there a better approach to this problem?
We have snapshotting set to `never` and are using the table whitelist.
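For reference, a minimal sketch of the connector configuration described above. Property names follow the Debezium Postgres connector (`snapshot.mode`, `table.whitelist`, `schema.whitelist`; newer releases rename the filters to `table.include.list` / `schema.include.list`), and all host, database, and table names here are placeholders:

```json
{
  "name": "single-table-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "db.example.com",
    "database.port": "5432",
    "database.user": "debezium",
    "database.password": "****",
    "database.dbname": "appdb",
    "database.server.name": "appdb",

    "snapshot.mode": "never",
    "table.whitelist": "public.orders",
    "schema.whitelist": "public"
  }
}
```

Even with these filters set, the behavior reported here is that the connector still reads the structure of all tables at startup, which is what this enhancement asks to limit.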
This was originally reported on Gitter. I wish there were a way to link to a thread so that I could attach it here; instead, I've copied it into the comments below. I was advised to convert the thread into a feature enhancement.
- is related to:
  - DBZ-1311 Reading structure of captured tables is too slow (Closed)
  - DBZ-1965 During step 5, the connector scans all tables regardless of whitelist / blacklist filtering; for schemas with a large number of tables relative to the whitelist / blacklist exclusions, this can result in significant wait times (approximately 2 sec per table) (Closed)