Infinispan / ISPN-16286

RESP `KEYS *` hangs with persistence enabled


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Affects Version/s: 15.0.6.Final, 15.1.0.Dev02
    • Fix Version/s: 15.1.0.Dev02
    • Component/s: RESP
    • Labels: None

      The `KEYS *` operation hangs in RESP when the cache is configured with persistence, even when the cache is empty. A thread dump shows the following stack trace:

      "non-blocking-thread-node-0-p2-t10" #55 [3570308] daemon prio=5 os_prio=0 cpu=185.65ms elapsed=39.66s tid=0x0000556182a74a80 nid=3570308 waiting on condition  [0x00007fdc842fd000]
         java.lang.Thread.State: WAITING (parking)
      	at jdk.internal.misc.Unsafe.park(java.base@21.0.1/Native Method)
      	- parking to wait for  <0x00000000e3cbba18> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
      	at java.util.concurrent.locks.LockSupport.park(java.base@21.0.1/LockSupport.java:371)
      	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionNode.block(java.base@21.0.1/AbstractQueuedSynchronizer.java:519)
      	at java.util.concurrent.ForkJoinPool.unmanagedBlock(java.base@21.0.1/ForkJoinPool.java:3780)
      	at java.util.concurrent.ForkJoinPool.managedBlock(java.base@21.0.1/ForkJoinPool.java:3725)
      	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@21.0.1/AbstractQueuedSynchronizer.java:1707)
      	at io.reactivex.rxjava3.internal.operators.flowable.BlockingFlowableIterable$BlockingFlowableIterator.hasNext(BlockingFlowableIterable.java:101)
      	at org.infinispan.commons.util.Closeables$1.hasNext(Closeables.java:247)
      	at org.infinispan.stream.impl.DistributedCacheStream$1.getNext(DistributedCacheStream.java:379)
      	at org.infinispan.commons.util.AbstractIterator.hasNext(AbstractIterator.java:26)
      	at org.infinispan.server.iteration.DefaultIterationManager.next(DefaultIterationManager.java:234)
      	at org.infinispan.server.resp.commands.iteration.BaseIterationCommand.iterate(BaseIterationCommand.java:84)
      	at org.infinispan.server.resp.commands.iteration.BaseIterationCommand.initializeAndIterate(BaseIterationCommand.java:73)
      	at org.infinispan.server.resp.commands.iteration.BaseIterationCommand.perform(BaseIterationCommand.java:50)
      	at org.infinispan.server.resp.Resp3Handler.actualHandleRequest(Resp3Handler.java:85)
      	at org.infinispan.server.resp.RespRequestHandler.handleRequest(RespRequestHandler.java:87)
      	at org.infinispan.server.resp.RespHandler.handleCommandAndArguments(RespHandler.java:148)
      	at org.infinispan.server.resp.RespHandler.channelRead(RespHandler.java:130)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
      	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
      	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
      	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
      	at org.infinispan.server.core.transport.StatsChannelHandler.channelRead(StatsChannelHandler.java:28)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
      	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
      	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
      	at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:801)
      	at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:501)
      	at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:399)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
      	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
      	at java.lang.Thread.runWith(java.base@21.0.1/Thread.java:1596)
      	at java.lang.Thread.run(java.base@21.0.1/Thread.java:1583)
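      For reference, the RESP wire encoding of the command that triggers the hang can be sketched with a small helper (illustrative only; the `encode` method below is hypothetical and not part of Infinispan):

      ```java
      public class RespEncode {
          // Encode a command as a RESP array of bulk strings,
          // e.g. KEYS * -> "*2\r\n$4\r\nKEYS\r\n$1\r\n*\r\n"
          public static String encode(String... args) {
              StringBuilder sb = new StringBuilder("*").append(args.length).append("\r\n");
              for (String arg : args) {
                  sb.append("$").append(arg.length()).append("\r\n")
                    .append(arg).append("\r\n");
              }
              return sb.toString();
          }

          public static void main(String[] args) {
              // The exact bytes a Redis client sends for KEYS *
              System.out.print(encode("KEYS", "*"));
          }
      }
      ```

      This is the request that `RespHandler.channelRead` decodes on the event loop before `BaseIterationCommand` starts iterating.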
      

      We are blocking the non-blocking (event loop) thread with the Rx blocking iterator: with persistence enabled, `hasNext()` parks the Netty thread while waiting for store results. The likely fix is to dispatch the iteration to the blocking manager and resume on Netty's event loop once results are available.
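      The dispatch-and-resume pattern can be sketched with plain `java.util.concurrent` (a minimal sketch only; the real fix would use Infinispan's blocking manager and Netty's event loop rather than these stand-in executors, and `fetchKeys` plus the hard-coded key list are hypothetical):

      ```java
      import java.util.concurrent.CompletableFuture;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.TimeUnit;

      public class DispatchExample {
          // Run the blocking iteration off the event loop, then hop back
          // to the event loop to write the RESP reply.
          public static String fetchKeys(ExecutorService blockingExecutor,
                                         ExecutorService eventLoop) throws Exception {
              CompletableFuture<String> reply = CompletableFuture
                      // Blocking work (e.g. iterating a persistent store)
                      // runs on a dedicated blocking pool, never on the
                      // event loop thread.
                      .supplyAsync(() -> "key1,key2", blockingExecutor)
                      // Resume on the event loop to encode and write the reply.
                      .thenApplyAsync(keys -> "*" + keys, eventLoop);
              return reply.get(5, TimeUnit.SECONDS);
          }

          public static void main(String[] args) throws Exception {
              ExecutorService blocking = Executors.newSingleThreadExecutor();
              ExecutorService loop = Executors.newSingleThreadExecutor();
              System.out.println(fetchKeys(blocking, loop));
              blocking.shutdown();
              loop.shutdown();
          }
      }
      ```

      The key property is that the event loop thread never calls a method that can park; it only receives completed results.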

              Assignee: Jose Bolina
              Reporter: Jose Bolina
