Undertow / UNDERTOW-1234

BlockingHandler doesn't report content-length if it fills more than one buffer


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version/s: 2.0.0.Beta1, 1.4.22.Final
    • Affects Version/s: 1.4.20.Final

      I'm not quite sure whether this is a bug in itself, but I noticed some odd behaviour. I was under the impression that any time we use exchange.getResponseSender().send(String), the full content length should be known.

      Given the following code.

      import io.undertow.Undertow;
      import io.undertow.predicate.Predicates;
      import io.undertow.server.HttpHandler;
      import io.undertow.server.handlers.BlockingHandler;
      import io.undertow.server.handlers.encoding.ContentEncodingRepository;
      import io.undertow.server.handlers.encoding.EncodingHandler;
      import io.undertow.server.handlers.encoding.GzipEncodingProvider;

      public class GzipContentLengthExample {

          public static void main(String[] args) {
              // Respond with a string of exactly 'size' dashes.
              HttpHandler handler = exchange -> {
                  int size = Integer.parseInt(exchange.getQueryParameters().get("size").getFirst());
                  String data = String.format("%0" + size + "d", 0).replace('0', '-');
                  exchange.getResponseSender().send(data);
              };

              EncodingHandler gzipHandler = new EncodingHandler(new ContentEncodingRepository()
                      .addEncodingHandler("gzip",
                          // This 1000 is a priority; not exactly sure what it does.
                          new GzipEncodingProvider(), 1000,
                          // Anything under a content-length of 20 will not be gzipped
                          Predicates.maxContentSize(20)))
                      .setNext(handler);

              BlockingHandler blockingHandler = new BlockingHandler();
              blockingHandler.setRootHandler(gzipHandler);

              Undertow.builder()
                      .addHttpListener(8080, "0.0.0.0", blockingHandler)
                      .build().start();
          }
      }
      
      curl -v -s -H 'Accept-Encoding: gzip' -o /dev/null http://localhost:8080/?size=16383
      *   Trying ::1...
      * TCP_NODELAY set
      * Connected to localhost (::1) port 8080 (#0)
      > GET /?size=16383 HTTP/1.1
      > Host: localhost:8080
      > User-Agent: curl/7.51.0
      > Accept: */*
      > Accept-Encoding: gzip
      >
      < HTTP/1.1 200 OK
      < Content-Encoding: gzip
      < Connection: keep-alive
      < Content-Length: 52
      < Date: Tue, 14 Nov 2017 03:48:31 GMT
      
      curl -v -s -H 'Accept-Encoding: gzip' -o /dev/null http://localhost:8080/?size=16384
      *   Trying ::1...
      * TCP_NODELAY set
      * Connected to localhost (::1) port 8080 (#0)
      > GET /?size=16384 HTTP/1.1
      > Host: localhost:8080
      > User-Agent: curl/7.51.0
      > Accept: */*
      > Accept-Encoding: gzip
      >
      < HTTP/1.1 200 OK
      < Connection: keep-alive
      < Transfer-Encoding: chunked
      < Date: Tue, 14 Nov 2017 03:48:50 GMT
      

      The first request was gzipped and the second was not. I understand that with large responses sent via chunked transfer encoding we may not know the full content length, but in this case we should, since the entire string is handed to send() up front. The flip between 16383 and 16384 bytes also lines up with a 16KB buffer, which is presumably the "fills more than one buffer" case from the title.
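
      For comparison, here is a sketch of the same chain with the BlockingHandler removed. My assumption (not verified) is that on the non-blocking path send(String) sets the Content-Length itself before the encoding predicate runs, so both sizes would come back gzipped with a Content-Length:

          // Sketch only (assumption, not verified): drop the BlockingHandler and pass
          // the EncodingHandler straight to the listener. On the non-blocking path,
          // send(String) should set Content-Length itself, so the maxContentSize(20)
          // predicate sees the real size for both 16383 and 16384.
          HttpHandler handler = exchange -> {
              int size = Integer.parseInt(exchange.getQueryParameters().get("size").getFirst());
              String data = String.format("%0" + size + "d", 0).replace('0', '-');
              exchange.getResponseSender().send(data);
          };

          EncodingHandler gzipHandler = new EncodingHandler(new ContentEncodingRepository()
                  .addEncodingHandler("gzip", new GzipEncodingProvider(), 1000,
                          Predicates.maxContentSize(20)))
                  .setNext(handler);

          Undertow.builder()
                  .addHttpListener(8080, "0.0.0.0", gzipHandler)   // no BlockingHandler
                  .build().start();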

      Also, this is a bit tricky with `Predicates.maxContentSize(20)`. For the second request, the content length is still -1 when the predicate is evaluated, which means we do not gzip. However, by that point we know at least one buffer has already been filled, so the gzipping could be turned on. A workaround would be to always turn on gzipping with a true predicate (sketched below), but it seems like there may be a bug somewhere in here.
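
      For reference, a sketch of that workaround, swapping in Predicates.truePredicate() so gzip is enabled regardless of whether the content length is known (at the cost of also gzipping tiny responses):

          // Workaround sketch: always enable gzip, independent of content length.
          // Predicates.truePredicate() replaces Predicates.maxContentSize(20), so the
          // unknown (-1) content length on the blocking path no longer disables gzip.
          EncodingHandler gzipHandler = new EncodingHandler(new ContentEncodingRepository()
                  .addEncodingHandler("gzip", new GzipEncodingProvider(), 1000,
                          Predicates.truePredicate()))
                  .setNext(handler);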

      Let me know if you need more information / clarification or if this is the expected result.

              Assignee: Stuart Douglas
              Reporter: Bill O'Neil