Infinispan / ISPN-8347

Provide a silent way to check for Cache existence from a Hot Rod client


    • Type: Enhancement
    • Resolution: Done
    • Priority: Major
    • Affects Version: 9.1.1.Final
    • Component: Remote Protocols

      Currently a Hot Rod client can create a new cache by using

      hotrodClient.administration().createCache( cacheName, null );
      

      But we shouldn't invoke this if the Cache might already exist.
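
      For context, the same administrative call in a self-contained form (the client bootstrap, server address, and cache name are illustrative assumptions, not part of this report):

      import org.infinispan.client.hotrod.RemoteCacheManager;
      import org.infinispan.client.hotrod.configuration.ConfigurationBuilder;

      public class CreateCacheExample {
         public static void main(String[] args) {
            // Hypothetical server coordinates; adjust to your environment.
            ConfigurationBuilder builder = new ConfigurationBuilder();
            builder.addServer().host("127.0.0.1").port(11222);

            RemoteCacheManager hotrodClient = new RemoteCacheManager(builder.build());
            try {
               // A null template asks the server to use its default cache configuration;
               // the (String) cast only disambiguates the overloads of newer clients.
               hotrodClient.administration().createCache("ENTITY_CACHE", (String) null);
            } finally {
               hotrodClient.stop();
            }
         }
      }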

      When we don't know whether the cache already exists, we check in advance with

      RemoteCache<?,?> cache = hotrodClient.getCache( cacheName );
      if ( cache == null ) {
          ...
      }
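
      Putting the two calls together, the check-then-create pattern looks roughly like this sketch; the ensureCache helper is illustrative, not an Infinispan API:

      import org.infinispan.client.hotrod.RemoteCache;
      import org.infinispan.client.hotrod.RemoteCacheManager;

      public class EnsureCacheExists {

         // Hypothetical helper wrapping the check-then-create pattern from this issue.
         static RemoteCache<?, ?> ensureCache(RemoteCacheManager hotrodClient, String cacheName) {
            // This lookup is what makes the server log the CacheNotFoundException
            // stack trace shown below when the cache does not exist yet.
            RemoteCache<?, ?> cache = hotrodClient.getCache(cacheName);
            if (cache == null) {
               // A null template asks the server to use its default cache configuration.
               hotrodClient.administration().createCache(cacheName, (String) null);
               cache = hotrodClient.getCache(cacheName);
            }
            return cache;
         }
      }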
      

      This works fine from a client-side perspective, but it causes the server to log a full stack trace with a not-so-reassuring ERROR:

      20:22:35,437  WARN Codec21:361 - ISPN004005: Error received from the server: org.infinispan.server.hotrod.CacheNotFoundException: Cache with name 'ENTITY_CACHE' not found amongst the configured caches
      2017-09-26 20:22:36,021 ERROR [org.infinispan.server.hotrod.CacheDecodeContext] (HotRod-ServerWorker-3-7) ISPN005003: Exception reported: org.infinispan.server.hotrod.CacheNotFoundException: Cache with name 'ANOTHER_ENTITY_CACHE' not found amongst the configured caches
      	at org.infinispan.server.hotrod.CacheDecodeContext.obtainCache(CacheDecodeContext.java:121)
      	at org.infinispan.server.hotrod.HotRodDecoder.decodeHeader(HotRodDecoder.java:160)
      	at org.infinispan.server.hotrod.HotRodDecoder.decode(HotRodDecoder.java:92)
      	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:411)
      	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:248)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
      	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
      	at org.infinispan.server.core.transport.StatsChannelHandler.channelRead(StatsChannelHandler.java:26)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
      	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
      	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926)
      	at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:1017)
      	at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:394)
      	at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:299)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
      	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
      	at java.lang.Thread.run(Thread.java:748)
      
      

      I'm not sure if we could easily avoid logging such an error, as this check is "normal business" for our code. Alternatively, I'd welcome a new client operation to query the defined/started caches.
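
      For reference, such a silent existence check could look like the sketch below; RemoteCacheManager#getCacheNames() is an assumption about a newer client API rather than something available at the time of this report, so check the client's documentation before relying on it:

      import java.util.Set;

      import org.infinispan.client.hotrod.RemoteCacheManager;

      public class CacheExistenceCheck {

         // Assumed newer API: getCacheNames() returns the names of the caches
         // defined on the server, allowing an existence check without errors.
         static boolean cacheExists(RemoteCacheManager hotrodClient, String cacheName) {
            Set<String> names = hotrodClient.getCacheNames();
            return names.contains(cacheName);
         }
      }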

              Assignee: Tristan Tarrant (ttarrant@redhat.com)
              Reporter: Sanne Grinovero (sgrinove, Inactive)
              Votes: 0
              Watchers: 2
