OpenShift Logging · LOG-3861

LogQL metric queries not working on OCP Console with Logging v5.7


Details

    • Type: Bug
    • Resolution: Not a Bug
    • Priority: Minor
    • Components: Log Console, Log Storage
    • Severity: Low

    Description

      Description of problem:

Metric queries do not seem to work, or at least their results are not being rendered.
      
      This query:
      
      {kubernetes_namespace_name="reversewords", kubernetes_pod_name=~"reverse-words-.*"} |= "Listening on port 8080"
      
returns 1 result for the last 2 hours when run through the UI. However, if I run a metric query like this:
      
      count_over_time({kubernetes_namespace_name="reversewords", kubernetes_pod_name=~"reverse-words-.*"} |= "Listening on port 8080" [2h])
      
the UI shows: Warning alert: No datapoints found
      
Checking my browser's developer tools, I see this request being made:
      
      https://console-openshift-console.apps.cluster.example.com/api/proxy/plugin/logging-view-plugin/backend/api/logs/v1/application/loki/api/v1/query_range?query=count_over_time({+kubernetes_namespace_name="reversewords",+kubernetes_pod_name=~"reverse-words-.*"+}+|=+"Listening+on+port+8080"+[2h])&start=1679901283256000000&end=1679908483256000000&limit=100
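The console plugin URL-encodes the LogQL expression into the query parameter of Loki's query_range endpoint. The request above can be reconstructed with a short sketch (Python; the base URL here is a shortened placeholder for the console proxy path shown above):

```python
from urllib.parse import urlencode

# The LogQL metric query from this report
logql = ('count_over_time({kubernetes_namespace_name="reversewords", '
         'kubernetes_pod_name=~"reverse-words-.*"} '
         '|= "Listening on port 8080" [2h])')

# Placeholder host; the real request goes through the console proxy path above
base = ("https://console.example.com/api/proxy/plugin/logging-view-plugin/"
        "backend/api/logs/v1/application/loki/api/v1/query_range")

# start/end are nanosecond timestamps, as in the captured request
params = urlencode({
    "query": logql,
    "start": 1679901283256000000,
    "end": 1679908483256000000,
    "limit": 100,
})
url = f"{base}?{params}"
print(url)
```

Issuing a GET against this URL (with a valid bearer token) returns the same matrix response shown below, which is a quick way to confirm the backend independently of the UI.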
      
      And this is the result:
      
{"status":"success","data":{"resultType":"matrix","result":[{"metric":{"kubernetes_container_name":"reversewords","kubernetes_host":"openshift-master-2.cluster.example.com","kubernetes_namespace_name":"reversewords","kubernetes_pod_name":"reverse-words-8548cc98db-g4k2m","log_type":"application"},"values":[[1679906172,"1"],[1679906200,"1"],[1679906228,"1"],[1679906256,"1"],[1679906284,"1"],[1679906312,"1"],[1679906340,"1"],[1679906368,"1"],[1679906396,"1"],[1679906424,"1"],[1679906452,"1"],[1679906480,"1"],[1679906508,"1"],[1679906536,"1"],[1679906564,"1"],[1679906592,"1"],[1679906620,"1"],[1679906648,"1"],[1679906676,"1"],[1679906704,"1"],[1679906732,"1"],[1679906760,"1"],[1679906788,"1"],[1679906816,"1"],[1679906844,"1"],[1679906872,"1"],[1679906900,"1"],[1679906928,"1"],[1679906956,"1"],[1679906984,"1"],[1679907012,"1"],[1679907040,"1"],[1679907068,"1"],[1679907096,"1"],[1679907124,"1"],[1679907152,"1"],[1679907180,"1"],[1679907208,"1"],[1679907236,"1"],[1679907264,"1"],[1679907292,"1"],[1679907320,"1"],[1679907348,"1"],[1679907376,"1"],[1679907404,"1"],[1679907432,"1"],[1679907460,"1"],[1679907488,"1"],[1679907516,"1"],[1679907544,"1"],[1679907572,"1"],[1679907600,"1"],[1679907628,"1"],[1679907656,"1"],[1679907684,"1"],[1679907712,"1"],[1679907740,"1"],[1679907768,"1"],[1679907796,"1"],[1679907824,"1"],[1679907852,"1"],[1679907880,"1"],[1679907908,"1"],[1679907936,"1"],[1679907964,"1"],[1679907992,"1"],[1679908020,"1"],[1679908048,"1"],[1679908076,"1"],[1679908104,"1"],[1679908132,"1"],[1679908160,"1"],[1679908188,"1"],[1679908216,"1"],[1679908244,"1"],[1679908272,"1"],[1679908300,"1"],[1679908328,"1"],[1679908356,"1"],[1679908384,"1"],[1679908412,"1"],[1679908440,"1"],[1679908468,"1"]]}],"stats":{"summary":{"bytesProcessedPerSecond":129602,"linesProcessedPerSecond":78,"totalBytesProcessed":3288,"totalLinesProcessed":2,"execTime":0.025369938,"queueTime":0,"subqueries":3,"totalEntriesReturned":1},"querier":{"store":{"totalChunksRef":0,"totalChunksDownloaded":0,"chunksDownloadTime":0,"chunk":{"headChunkBytes":0,"headChunkLines":0,"decompressedBytes":0,"decompressedLines":0,"compressedBytes":0,"totalDuplicates":0}}},"ingester":{"totalReached":16,"totalChunksMatched":1,"totalBatches":1,"totalLinesSent":1,"store":{"totalChunksRef":0,"totalChunksDownloaded":0,"chunksDownloadTime":0,"chunk":{"headChunkBytes":3288,"headChunkLines":2,"decompressedBytes":0,"decompressedLines":0,"compressedBytes":0,"totalDuplicates":0}}},"cache":{"chunk":{"entriesFound":0,"entriesRequested":0,"entriesStored":0,"bytesReceived":0,"bytesSent":0,"requests":0},"index":{"entriesFound":0,"entriesRequested":0,"entriesStored":0,"bytesReceived":0,"bytesSent":0,"requests":0},"result":{"entriesFound":2,"entriesRequested":2,"entriesStored":1,"bytesReceived":1904,"bytesSent":0,"requests":3}}}}}
      
This result makes me think that the data is being returned but the UI is not rendering the response.
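This can be verified by parsing the response body: the matrix result does contain datapoints, so the "No datapoints found" alert is misleading. A minimal sketch using a trimmed excerpt of the response above:

```python
import json

# Trimmed excerpt of the query_range response captured above
response = json.loads('''
{"status": "success",
 "data": {"resultType": "matrix",
          "result": [{"metric": {"kubernetes_namespace_name": "reversewords",
                                 "kubernetes_pod_name": "reverse-words-8548cc98db-g4k2m"},
                      "values": [[1679906172, "1"], [1679906200, "1"], [1679906228, "1"]]}]}}
''')

assert response["status"] == "success"
assert response["data"]["resultType"] == "matrix"

# Count datapoints across all returned series
datapoints = sum(len(series["values"]) for series in response["data"]["result"])
print(f"{datapoints} datapoints across {len(response['data']['result'])} series")
```

Run against the full response, the same loop counts 82 datapoints in one series, all with value "1", so the backend side of the query is working as expected.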

      Version-Release number of selected component (if applicable):

      cluster-logging.v5.7.0
      
      loki-operator.v5.7.0

      How reproducible:

      Always
      
      I can share the environment where this is reproduced.

      Steps to Reproduce:

1. Run a log query that returns values
      2. Run a metric query (such as count_over_time) over the same stream selector
      

      Actual results:

      No results returned

      Expected results:

      Results returned

      Additional info:

      I can share the environment


People

              Assignee: Unassigned
              Reporter: Mario Vazquez Cebrian (mavazque@redhat.com)
              QA Contact: Anping Li
              Votes: 0
              Watchers: 2
