Project: Migration Toolkit for Applications
Issue: MTA-6526

[DOC] Add Conceptual Overview of Solution Server and LLM Integration


    • Type: Task
    • Resolution: Unresolved
    • Priority: Undefined
    • Fix version: MTA 8.2.0
    • Component: Documentation
    • Quality / Stability / Reliability

      Per internal review feedback (Comment G30), the documentation requires a foundational conceptual overview to explain the interplay between the Solution Server (SS) and the Large Language Model (LLM). Currently, the guide moves quickly into technical configuration without establishing how these components collaborate to provide AI-driven code fixes. Providing this context upfront ensures users understand the data flow and security architecture before they begin deployment.

      Comment
      Correct me if I am wrong, but this seems to be a decision-making point where you select the generative AI option. The configurations that you complete after that depend on the AI option you select.
      This should be made clear in the title itself, because it is the core focus of this section. The current title seems very generic; consider something like "Generative AI options to request code resolutions".
      Also, the wording makes the topic seem like a mix of conceptual and procedural information. Consider using something like: "An example workflow for configuring an LLM service on OpenShift AI broadly requires the following configurations:"
      in https://docs.redhat.com/en/documentation/migration_toolkit_for_applications/8.0/html/configuring_and_using_red_hat_developer_lightspeed_for_mta/configuring-llm_mta-developer-lightspeed

      Detailed Content Requirements

      The new "Conceptual Overview" section should be placed at the beginning of the configuration guide, leveraging the first two paragraphs currently found in the Solution Server configuration section. The content must address the following:

      1. Collaborative Workflow: Explain how the Solution Server acts as the intermediary agent that receives analysis findings from the MTA IDE extension or CLI and requests context-aware resolutions from the LLM.

      2. Architecture Flow: Describe the high-level processing flow (IDE -> Solution Server -> LLM) to address security concerns and clarify where code suggestions originate.

      3. Agentic Automation: Define the "Agent Mode" where the Solution Server mass-generates code resolutions for multiple migration issues without requiring direct IDE interaction for every fix.

      4. Security and Secrets: Briefly mention that the Solution Server manages secure authentication to the LLM via cluster secrets, ensuring credentials are never exposed in plain text.
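
      The cluster-secrets requirement above could be illustrated in the new overview with a short, hedged example. The following is only a sketch of how LLM credentials might be stored as a Kubernetes Secret on OpenShift; the secret name, namespace, and key are hypothetical placeholders, not the actual identifiers used by MTA or the Solution Server:

      ```yaml
      apiVersion: v1
      kind: Secret
      metadata:
        # Hypothetical name and namespace; substitute the values your deployment uses.
        name: llm-provider-credentials
        namespace: openshift-mta
      type: Opaque
      stringData:
        # The Solution Server reads the provider API key from a cluster secret,
        # so the credential never appears in plain-text configuration or in the IDE.
        api-key: <your-llm-api-key>
      ```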

      Reference Hyperlinks

      Acceptance Criteria

      • [ ] A new "Conceptual Overview" or "Introduction to Integration" section is added to the guide.
      • [ ] The text clearly defines the roles of the Solution Server and the LLM in the migration pipeline.
      • [ ] The content includes a high-level summary of the end-to-end data flow (IDE to LLM).
      • [ ] Redundancy is reduced by ensuring this overview serves as the primary introduction for subsequent configuration chapters.

              rhn-support-anarnold A Arnold
              Votes: 0
              Watchers: 1