RESTEASY-2456

JVM Heap leak in SSE

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Done
    • Affects Version/s: 4.4.1.Final, 3.9.3.Final
    • Fix Version/s: 3.10.0.Final, 4.4.2.Final
    • Component/s: None
    • Labels: None
    • Steps to Reproduce:
      Run a simple Quarkus SSE application with about 100 new concurrent users per second, in a container with 1 CPU and 550 MB of RAM, for about 5 minutes (see the sketch below).
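    A minimal SSE resource of the kind used for this test might look like the sketch below; the path, payload, and event count are assumptions, not the actual reproducer.

    ```java
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.Context;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.sse.Sse;
    import javax.ws.rs.sse.SseEventSink;

    // Hypothetical minimal Quarkus/RESTEasy SSE endpoint; driving it with ~100 new
    // concurrent users per second reproduces the heap growth described below.
    @Path("/stream")
    public class SseResource {

        @GET
        @Produces(MediaType.SERVER_SENT_EVENTS)
        public void stream(@Context SseEventSink sink, @Context Sse sse) {
            // Send a few events, then close the sink for this request.
            try (SseEventSink s = sink) {
                for (int i = 0; i < 5; i++) {
                    s.send(sse.newEvent("event-" + i));
                }
            }
        }
    }
    ```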

    Description

    A RESTEasy-based SSE application (running on Quarkus) drains the entire heap space fairly quickly. A quick Google search shows that this is a fairly common problem, especially when running on limited resources. Since we cannot afford to increase the JVM heap to 8 GB for our SSE solution (as LinkedIn did: https://engineering.linkedin.com/blog/2016/10/instant-messaging-at-linkedin--scaling-to-hundreds-of-thousands-), I spent some time trying to identify the source of the leak.

    Heap dump analysis pointed me to many byte arrays left over after a major GC. Following the references, I ended up in `SseEventOutputImpl` (`contextDataMap`) and found that it is not cleaned up on close. I found a suitable `clearContextData` method in `org.jboss.resteasy.core.SynchronousDispatcher` and copied it into `SseEventOutputImpl`, where it is called from the `close` method. After that change, a major GC is able to collect all of the leftovers.
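    For reference, a minimal sketch of the workaround described above, assuming RESTEasy 4.x's `ResteasyContext` API for the per-request context data; the class name and the exact body of `clearContextData` are illustrative, not the actual patch.

    ```java
    import java.util.Map;

    import org.jboss.resteasy.core.ResteasyContext;

    // Sketch of the workaround: release the per-request context data snapshot when
    // the SSE output is closed, so the byte arrays it references become eligible
    // for collection at the next major GC. Only contextDataMap, close(), and
    // clearContextData() come from the issue description; the rest is illustrative.
    public class SseEventOutputSketch {

        // Snapshot of the request's context data, captured when the output is created
        // (assumes RESTEasy 4.x's ResteasyContext static accessors).
        private final Map<Class<?>, Object> contextDataMap = ResteasyContext.getContextDataMap();

        private volatile boolean closed;

        public void close() {
            if (closed) {
                return;
            }
            closed = true;
            // ... complete the underlying async response as before ...
            clearContextData(); // new: drop references held by the context snapshot
        }

        // Modeled on org.jboss.resteasy.core.SynchronousDispatcher#clearContextData.
        private void clearContextData() {
            contextDataMap.clear();
        }
    }
    ```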

    Attachments

    Activity

    People

    Assignee:
    asoldano Alessio Soldano
    Reporter:
    artur.zagozdzinski Artur Zagozdzinski (Inactive)
    Votes:
    0
    Watchers:
    2

    Dates

    Created:
    Updated:
    Resolved: