Type: Enhancement
Resolution: Unresolved
Priority: Major
We recently added a new endpoint for batch syncing applications: https://github.com/3scale/apisonator/pull/453
In that PR we added a new method, save_application_with_data, which writes all the information that comes from porta about one application. However, it doesn't remove old data; it simply dumps whatever it receives. This carries over the prior behavior, but working with batches.
As a result, the current code can produce inconsistencies, such as rotated keys that are still usable, or orphaned objects.
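To illustrate the problem, here is a minimal sketch (hypothetical, not apisonator's actual storage layer) where a Hash of Sets stands in for Redis. Writing the new state on top of the old one, without deleting anything, leaves stale entries behind:

```ruby
require "set"

# Stand-in for Redis: a Hash mapping keys to Sets of values.
store = Hash.new { |h, k| h[k] = Set.new }

# First sync: the application has one key.
store["app:1:keys"] << "old-key"

# The key gets rotated in porta; the batch sync just dumps the new state
# without removing the old one.
store["app:1:keys"] << "new-key"

# The rotated key is still usable in backend:
store["app:1:keys"].include?("old-key") # => true
```

A true sync would have to delete `old-key` before (or while) writing `new-key`.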
To be sure the data in backend is a 1:1 copy of the data in porta, what we need is a true sync function. The idea is to replace save_application_with_data with a Lua script that does the same thing.
Things to consider:
- Redis supports transactions with atomicity, but it does not support rollbacks (blog and docs)
- Since there are no rollbacks, it's crucial that the script never fails
- Data consistency doesn't depend only on transactions completing successfully: apisonator also maintains a memoized cache, and we must ensure it's synced as well. In particular, calls to app.create_key and app.create_referrer_filter currently invalidate the cache.
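Putting the points above together, a true sync has replace-then-invalidate semantics. The Ruby sketch below models what the Lua script would do (all names are hypothetical; in the real implementation the delete and write would run atomically inside a single EVAL, and the cache invalidation would happen on the apisonator side after the script succeeds):

```ruby
require "set"

# Stand-in for Redis: application state keyed by app id.
STORE = {}
# Stand-in for apisonator's memoized cache.
CACHE = {}

# Hypothetical true sync: the old state is discarded, not merged into,
# and the memoized cache entry is invalidated, mirroring what
# app.create_key and app.create_referrer_filter do today.
def sync_application(app_id, keys:, referrer_filters:)
  STORE[app_id] = {
    keys: Set.new(keys),
    referrer_filters: Set.new(referrer_filters)
  }
  CACHE.delete(app_id) # keep the memoized view consistent with storage
end

sync_application("1", keys: ["old-key"], referrer_filters: [])
CACHE["1"] = STORE["1"] # something memoizes the app state

# Key rotation in porta; re-syncing replaces rather than appends.
sync_application("1", keys: ["new-key"], referrer_filters: [])

STORE["1"][:keys].include?("old-key") # => false
CACHE.key?("1")                       # => false
```

Because Redis offers no rollbacks, the script's body should be limited to operations that cannot fail mid-way (DEL followed by writes of already-validated data), so that a partial state is never left behind.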