Details

    • Type: Enhancement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 5.7.1.Final
    • Fix Version/s: 6.later
    • Component/s: LRA
    • Labels:
      None

      Description

      Currently each LRA coordinator (the component responsible for starting and managing LRAs) needs its own object store. This can cause problems in dynamic environments where nodes come and go.

      If we embed the node id in the id for an LRA then we can relax this restriction (ie different coordinators can share a store). This change will also mean that a single recovery manager can recover LRAs for multiple nodes.
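The node-id-in-LRA-id scheme above could be sketched roughly as follows. This is a hypothetical illustration only, not the actual Narayana implementation: the class name, the `_` separator, and the method names are all assumptions made for the example.

```java
// Hypothetical sketch of embedding a coordinator node id inside an LRA id,
// so that several coordinators can share one object store and a single
// recovery manager can tell which node each LRA belongs to.
// Assumption: node ids never contain the '_' separator character.
public final class LraId {
    private final String nodeId;
    private final String uid;

    public LraId(String nodeId, String uid) {
        this.nodeId = nodeId;
        this.uid = uid;
    }

    /** Encode node id and unique id into a single token, e.g. "node-1_0ffffabc". */
    public String encode() {
        return nodeId + "_" + uid;
    }

    /** Recover the owning node id from an encoded LRA id. */
    public static LraId decode(String encoded) {
        int sep = encoded.indexOf('_');
        if (sep < 0) {
            throw new IllegalArgumentException("no node id embedded in: " + encoded);
        }
        return new LraId(encoded.substring(0, sep), encoded.substring(sep + 1));
    }

    public String nodeId() { return nodeId; }
    public String uid() { return uid; }
}
```

With ids in this shape, a recovery manager scanning a shared store can group records by `nodeId()` and recover LRAs on behalf of any node, which is the relaxation the description asks for.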

      Some notes on the current implementation:

      1. You may have multiple coordinators for a single tree of transactions (ie nested LRAs can be managed by a coordinator that is different from the parent).
      2. There can only be one recovery coordinator per store. Note that this [restriction of the implementation] does not impact the spec since the mechanics of recovery is a private matter (ie in the spec we only demand that recovery takes place).
      3. The only user visible aspect of recovery is the "recovery coordinator URL" that participants receive when they enlist with the LRA and the implementation of that endpoint is opaque to the user.
      4. Provided recovery takes place (ie the correct participant endpoints are called in the appropriate way), that is all the user needs to know about recovery.
      5. The administrator can ask for recovering LRAs as follows: `GET {base uri}/lra-coordinator/?status={Completing|Compensated|etc}` returns a subset of all LRAs ... (this is discussed in the spec in the section called "Interoperability with other languages").
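The administrator query in note 5 could be issued with the JDK HTTP client, as in the minimal sketch below. The base URI and the plain-string response handling are assumptions for illustration; only the `/lra-coordinator/?status=` endpoint shape comes from the description above.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch of the administrator status query described in note 5.
public final class LraStatusQuery {

    /** Build the query URI, e.g. {base uri}/lra-coordinator/?status=Completing */
    public static URI statusQuery(String baseUri, String status) {
        return URI.create(baseUri + "/lra-coordinator/?status=" + status);
    }

    /** Fetch the subset of LRAs in the given state (assumed string body). */
    public static String listLras(String baseUri, String status) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(statusQuery(baseUri, status))
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```

Because recovery is a private matter per note 2, nothing about the response format here is mandated by the spec; only that such a status filter is available to the administrator.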

              People

              • Assignee: mmusgrov Michael Musgrove
              • Reporter: mmusgrov Michael Musgrove
              • Votes: 0
              • Watchers: 1
