Red Hat Data Grid / JDG-7438

Cache org.infinispan.ROLES should wait for initial state transfer


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version/s: RHDG 8.5.2 GA
    • Component/s: Core, Server

      In a constrained environment, non-blocking threads can block if a user starts issuing operations (through Hot Rod, for example) against a cache with authorization enabled before the internal ROLES or PERMISSIONS caches are fully initialized.

      This renders the server unresponsive: because the environment is constrained, there may be no free non-blocking threads left to accept new requests. The window to hit this scenario is narrow.
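A minimal sketch of the failure mode described above, using only the plain JDK rather than Infinispan APIs (the `rolesAfterStateTransfer` future and the single-thread pool are hypothetical stand-ins for the internal ROLES cache and the non-blocking thread pool): a role lookup issued before initial state transfer completes parks the only available thread, so no further work can be accepted until the transfer finishes.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class BlockedLookupSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical stand-in for the internal ROLES cache: its contents
        // only become readable once initial state transfer has finished.
        CompletableFuture<String> rolesAfterStateTransfer = new CompletableFuture<>();

        // Stand-in for the constrained non-blocking pool: a single thread.
        ExecutorService nonBlockingPool = Executors.newSingleThreadExecutor();

        // A request arrives before state transfer completes. The blocking
        // get() parks the only non-blocking thread (mirroring
        // CompletableFutures.await in the thread dump), so any further
        // request submitted to this pool must queue behind it.
        Future<String> request = nonBlockingPool.submit(
                () -> rolesAfterStateTransfer.get(1, TimeUnit.SECONDS));

        // State transfer finishes later; only then does the request unblock.
        Thread.sleep(200);
        rolesAfterStateTransfer.complete("admin");

        System.out.println(request.get()); // prints "admin"
        nonBlockingPool.shutdown();
    }
}
```

In the real server the parked thread is a Netty event-loop thread, so while it waits it also cannot accept or decode new connections, which is what makes the server appear unresponsive.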

      An example of a blocked thread:

      "non-blocking-thread--p2-t3" #45 daemon prio=5 os_prio=0 cpu=1466.34ms elapsed=1992.90s tid=0x0000558f70430000 nid=0xb1 waiting on condition  [0x00007f0ac99b0000]
         java.lang.Thread.State: TIMED_WAITING (parking)
          at jdk.internal.misc.Unsafe.park(java.base@11.0.14/Native Method)
          - parking to wait for  <0x000000009bc0d868> (a java.util.concurrent.CompletableFuture$Signaller)
          at java.util.concurrent.locks.LockSupport.parkNanos(java.base@11.0.14/LockSupport.java:234)
          at java.util.concurrent.CompletableFuture$Signaller.block(java.base@11.0.14/CompletableFuture.java:1798)
          at java.util.concurrent.ForkJoinPool.managedBlock(java.base@11.0.14/ForkJoinPool.java:3128)
          at java.util.concurrent.CompletableFuture.timedGet(java.base@11.0.14/CompletableFuture.java:1868)
          at java.util.concurrent.CompletableFuture.get(java.base@11.0.14/CompletableFuture.java:2021)
          at org.infinispan.util.concurrent.CompletableFutures.await(CompletableFutures.java:126)
          at org.infinispan.interceptors.impl.SimpleAsyncInvocationStage.get(SimpleAsyncInvocationStage.java:36)
          at org.infinispan.interceptors.impl.AsyncInterceptorChainImpl.invoke(AsyncInterceptorChainImpl.java:249)
          at org.infinispan.cache.impl.InvocationHelper.doInvoke(InvocationHelper.java:297)
          at org.infinispan.cache.impl.InvocationHelper.invoke(InvocationHelper.java:101)
          at org.infinispan.cache.impl.CacheImpl.get(CacheImpl.java:542)
          at org.infinispan.cache.impl.CacheImpl.get(CacheImpl.java:536)
          at org.infinispan.cache.impl.AbstractDelegatingCache.get(AbstractDelegatingCache.java:439)
          at org.infinispan.cache.impl.EncoderCache.get(EncoderCache.java:683)
          at org.infinispan.security.mappers.ClusterRoleMapper.principalToRoles(ClusterRoleMapper.java:48)
          at org.infinispan.security.impl.Authorizer.computeSubjectACL(Authorizer.java:153)
          at org.infinispan.security.impl.Authorizer.checkSubjectPermissionAndRole(Authorizer.java:134)
          at org.infinispan.security.impl.Authorizer.checkPermission(Authorizer.java:102)
          at org.infinispan.security.impl.Authorizer.checkPermission(Authorizer.java:83)
          at org.infinispan.security.impl.AuthorizationManagerImpl.checkPermission(AuthorizationManagerImpl.java:53)
          at org.infinispan.security.impl.SecureCacheImpl.removeAsync(SecureCacheImpl.java:637)
          at org.infinispan.server.hotrod.CacheRequestProcessor.removeInternal(CacheRequestProcessor.java:369)
          at org.infinispan.server.hotrod.CacheRequestProcessor.remove(CacheRequestProcessor.java:364)
          at org.infinispan.server.hotrod.HotRodDecoder.switch1(HotRodDecoder.java:1196)
          at org.infinispan.server.hotrod.HotRodDecoder.switch1_0(HotRodDecoder.java:156)
          at org.infinispan.server.hotrod.HotRodDecoder.decode(HotRodDecoder.java:145)
          at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:507)
          at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:446)
          at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
          at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
          at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
          at org.infinispan.server.core.transport.StatsChannelHandler.channelRead(StatsChannelHandler.java:28)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
          at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
          at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
          at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
          at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)
          at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)
          at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
          at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
          at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
          at java.lang.Thread.run(java.base@11.0.14/Thread.java:829)
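
The dump shows the event-loop thread parked inside `CompletableFuture.get`. A minimal sketch of the general alternative (plain JDK, not Infinispan's actual fix; `roles` and the predicate are hypothetical): composing on the future instead of blocking on it, so the calling thread is never parked and stays free to serve other requests.

```java
import java.util.concurrent.CompletableFuture;

public class NonBlockingComposeSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical stand-in for the not-yet-initialized ROLES cache.
        CompletableFuture<String> roles = new CompletableFuture<>();

        // Instead of parking the caller with roles.get(), chain the
        // authorization check so it runs only once the value is available.
        CompletableFuture<Boolean> allowed =
                roles.thenApply(r -> r.contains("admin"));

        // Later: state transfer completes and the chained check fires.
        roles.complete("admin,observer");
        System.out.println(allowed.get()); // prints "true"
    }
}
```

The issue title points at the complementary server-side remedy: have the internal ROLES cache wait for its initial state transfer before the server starts accepting operations that depend on it.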
      

              Assignee: Jose Bolina (rh-ee-jbolina)
              Reporter: Jose Bolina (rh-ee-jbolina)
              Tester: Anna Manukyan
              Votes: 0
              Watchers: 4
