Red Hat 3scale API Management / THREESCALE-5213

System-app memory usage continuously grows over time


    • Type: Bug
    • Resolution: Unresolved
    • Priority: Critical
    • Affects Version: 2.8 GA
    • Component: System

      This was originally observed on a cluster created for the internal customer adp-integrations, after the ThreeScaleContainerHighMemory alert started to fire on May 6th, five days after the cluster was created, and it has continued to fire since: Screenshot 2020-05-12 at 16.03.44.png

      As per the scaling SOP [1], an additional pod was added to the system-app DeploymentConfig, which in theory should decrease the memory usage of the two pods already present. However, memory usage for the existing system-app pods did not decrease: Screenshot 2020-05-13 at 13.02.09.png

      Memory usage for the newly added pod also grows continuously and will more than likely reach the same level as the other two pods within hours: Screenshot 2020-05-13 at 13.01.23.png

      There are no excessive logs in any of the system-app pods' containers.

      [1] https://github.com/integr8ly/middleware-load-testing/blob/master/sops/2.x/3scale-scaling.md#system-app
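
      For reference, the per-pod memory trend shown in the screenshots can be pulled with an instant query against the Prometheus HTTP API. This is a sketch, not the alert's actual expression: the Prometheus URL is a placeholder, and it assumes the standard cAdvisor metric container_memory_working_set_bytes with a pod label (older clusters label it pod_name instead).

{code:python}
# Query per-pod memory for system-app pods via the Prometheus HTTP API.
import requests

PROM_URL = "http://prometheus.example.com"  # hypothetical endpoint
QUERY = 'sum by (pod) (container_memory_working_set_bytes{pod=~"system-app-.*"})'

resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": QUERY}, timeout=10)
resp.raise_for_status()

for series in resp.json()["data"]["result"]:
    pod = series["metric"].get("pod", "?")
    timestamp, value = series["value"]  # [unix_ts, "<bytes as string>"]
    print(f"{pod}: {float(value) / 2**20:.0f} MiB")
{code}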

      Potentially related (shared by rhn-support-keprice in chat):
      https://issues.redhat.com/browse/THREESCALE-3397
      https://issues.redhat.com/browse/THREESCALE-3792

      Dev Notes
      Using tools like heap-dump and object-allocation profilers, you can take periodic snapshots of the process's live objects while its memory grows.

      If QE needs to test this, please share how to do that and/or help them as needed.

      These tools only tell you what is growing and which objects are created; they do not tell you where those objects come from.
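
      One way to get that "what is growing" view for a Ruby/Rails process such as system-app is to capture two ObjectSpace.dump_all heap dumps some time apart and diff them. The following is a minimal sketch of such a diff (an assumption about tooling, not necessarily the method used here); it expects the newline-delimited JSON that ObjectSpace.dump_all writes, and the file/line allocation sites are only present when ObjectSpace.trace_object_allocations_start was enabled before the objects were allocated.

{code:python}
# Diff two Ruby heap dumps (newline-delimited JSON from ObjectSpace.dump_all)
# and print the object types / allocation sites with the largest growth.
import json
import sys
from collections import Counter

def count_objects(path):
    """Count live objects per (type, allocation site) in one heap dump."""
    counts = Counter()
    with open(path) as f:
        for line in f:
            obj = json.loads(line)
            site = f'{obj.get("file", "?")}:{obj.get("line", "?")}'
            counts[(obj.get("type", "?"), site)] += 1
    return counts

before = count_objects(sys.argv[1])
after = count_objects(sys.argv[2])
growth = after - before  # Counter subtraction keeps only positive deltas

for (otype, site), delta in growth.most_common(20):
    print(f"+{delta:>8}  {otype:<10}  {site}")
{code}

      Run it as python heap_diff.py before.json after.json; the largest positive deltas point at the object types and allocation sites worth investigating.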

      You then need to find the origin of those leaks, given the growth pattern found above.

      You will need to deploy the application and feed it some traffic; a request loop like the sketch below is enough.
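
      A minimal traffic-generation sketch; the route URL and path are placeholders, so substitute whichever endpoints correlate with the memory growth.

{code:python}
# Feed steady traffic to a deployed system-app instance so the
# memory-growth pattern can be observed on the monitoring graphs.
import time
import requests

BASE_URL = "https://3scale-admin.example.com"  # hypothetical route

session = requests.Session()
for _ in range(100_000):
    resp = session.get(f"{BASE_URL}/", timeout=10)
    print(resp.status_code, len(resp.content))
    time.sleep(0.1)  # ~10 requests per second
{code}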

            Assignee: Unassigned
            Reporter: David Ffrench (dffrench@redhat.com)
            Votes: 0
            Watchers: 8
