OpenShift Logging / LOG-2765

Ingester pod cannot be started in an IPv6 cluster


    • Status: VERIFIED
    • Epic: OBSDA-7 - Adopting Loki as an alternative to Elasticsearch to support more lightweight, easier to manage/operate storage scenarios
    • Sprint: Log Storage - Sprint 221

      Version: loki-operator.v5.5.0
      ClusterType: UPI_on_Baremetal-packet_OVN-single-stack_http-proxy_Realtime-kernel

      Acked by ikanse

      Description:
      oc logs lokistack-sample-ingester-0
      level=info ts=2022-06-24T09:47:08.336937569Z caller=main.go:106 msg="Starting Loki" version="(version=HEAD-1000c2d, branch=HEAD, revision=1000c2d14)"
      level=info ts=2022-06-24T09:47:08.337267737Z caller=server.go:260 http=[::]:3100 grpc=[::]:9095 msg="server listening on addresses"
      level=info ts=2022-06-24T09:47:08.337536934Z caller=memberlist_client.go:394 msg="Using memberlist cluster node name" name=lokistack-sample-ingester-0-7843f243
      level=warn ts=2022-06-24T09:47:08.337580145Z caller=experimental.go:20 msg="experimental feature in use" feature="In-memory (FIFO) cache"
      level=warn ts=2022-06-24T09:47:08.337639084Z caller=experimental.go:20 msg="experimental feature in use" feature="In-memory (FIFO) cache"
      level=info ts=2022-06-24T09:47:08.338367117Z caller=shipper_index_client.go:111 msg="starting boltdb shipper in 2 mode"
      level=info ts=2022-06-24T09:47:08.338613508Z caller=table_manager.go:169 msg="uploading tables"
      level=error ts=2022-06-24T09:47:08.340007948Z caller=log.go:100 msg="error running loki" err="No address found for [eth0]\nerror initialising module: ingester\ngithub.com/grafana/dskit/modules.(*Manager).initModule\n\t/remote-source/loki/app/vendor/github.com/grafana/dskit/modules/modules.go:108\ngithub.com/grafana/dskit/modules.(*Manager).InitModuleServices\n\t/remote-source/loki/app/vendor/github.com/grafana/dskit/modules/modules.go:78\ngithub.com/grafana/loki/pkg/loki.(*Loki).Run\n\t/remote-source/loki/app/pkg/loki/loki.go:339\nmain.main\n\t/remote-source/loki/app/cmd/loki/main.go:108\nruntime.main\n\t/usr/lib/golang/src/runtime/proc.go:255\nruntime.goexit\n\t/usr/lib/golang/src/runtime/asm_amd64.s:1581"
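      The "No address found for [eth0]" error comes from the dskit ring lifecycler, which by default resolves the instance address by scanning the configured interfaces (eth0 by default) for a private IPv4 address. In a single-stack IPv6 cluster, eth0 only carries an IPv6 address, so resolution fails and the ingester module never initializes. A possible workaround is to bypass interface scanning by pinning the instance address to the pod IP. This is a hedged sketch, not the operator's actual fix: the `address`, `interface_names`, and `bind_addr` keys exist in Loki's lifecycler and memberlist configuration, but the `${POD_IP}` expansion assumes the pod IP is injected via the downward API, and `enable_inet6` only exists in newer dskit/Loki releases.

      ```yaml
      # Sketch of a Loki config override for an IPv6-only cluster.
      # Assumption: POD_IP is provided by the Kubernetes downward API
      # (status.podIP) via runtime config expansion.
      ingester:
        lifecycler:
          # Pin the advertised ring address instead of scanning eth0
          # for a private IPv4 address (which does not exist here).
          address: ${POD_IP}
          interface_names:
            - eth0
      memberlist:
        # Bind memberlist to the IPv6 wildcard address.
        bind_addr:
          - '::'
      ```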

      Steps to Reproduce:
      1) Create the aws_s3 secret.
      2) Deploy a LokiStack on bare metal with S3 as the backend storage.
      3) Check the pod status.

      lokistack-sample-compactor-0                       1/1     Running            0               8m24s   fd01:0:0:4::36   worker-00.knarra0624.qe.devcluster.openshift.com   <none>           <none>
      lokistack-sample-distributor-67878694cf-v75m7      1/1     Running            0               8m24s   fd01:0:0:4::34   worker-00.knarra0624.qe.devcluster.openshift.com   <none>           <none>
      lokistack-sample-gateway-d97f749c5-2jp6k           2/2     Running            0               8m24s   fd01:0:0:4::35   worker-00.knarra0624.qe.devcluster.openshift.com   <none>           <none>
      lokistack-sample-index-gateway-0                   1/1     Running            0               8m24s   fd01:0:0:5::3b   worker-01.knarra0624.qe.devcluster.openshift.com   <none>           <none>
      lokistack-sample-ingester-0                        0/1     CrashLoopBackOff   5 (2m35s ago)   5m38s   fd01:0:0:4::38   worker-00.knarra0624.qe.devcluster.openshift.com   <none>           <none>
      lokistack-sample-querier-557d77ffb4-lxxzh          1/1     Running            0               8m24s   fd01:0:0:5::38   worker-01.knarra0624.qe.devcluster.openshift.com   <none>           <none>
      lokistack-sample-query-frontend-645b6c4ff7-54d78   1/1     Running            0               8m24s   fd01:0:0:5::39   worker-01.knarra0624.qe.devcluster.openshift.com   <none>           <none>
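      Step 2 above can be sketched as a minimal LokiStack custom resource. This is an illustrative sketch only: the `loki.grafana.com/v1` API group matches loki-operator v5.5, the resource name `lokistack-sample` is inferred from the pod names above, and the namespace, size, storage class, and secret name are assumptions.

      ```yaml
      # Sketch of the LokiStack CR used in step 2 (names are assumptions).
      apiVersion: loki.grafana.com/v1
      kind: LokiStack
      metadata:
        name: lokistack-sample
        namespace: openshift-logging
      spec:
        size: 1x.small
        storageClassName: gp2        # illustrative; any available storage class
        storage:
          type: s3
          secret:
            name: aws-s3-secret      # the aws_s3 secret created in step 1
            type: s3
      ```

      After applying the CR, step 3 is `oc get pods -o wide`, which produced the pod listing above with the ingester in CrashLoopBackOff.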
      
      

            Assignee: ptsiraki@redhat.com Periklis Tsirakidis
            Reporter: rhn-support-anli Anping Li