- Bug
- Resolution: Done
- Critical
- 34.0.0.Final
- None
In a cluster with 2 nodes, where an application cancels and creates persistent timers during startup from the @PostConstruct method of a @Singleton, @Startup bean, the following stack trace is observed when one of the two nodes is in a suspended state:
[Server:ha-node-1] 15:25:10,728 INFO [org.jboss.playground.ejb.TestTimer] (ServerService Thread Pool -- 91) INIT: going to cancel
[Server:ha-node-1] 15:25:10,877 INFO [org.jboss.playground.ejb.TestTimer] (ServerService Thread Pool -- 91) Timer: id: 08079afb-210a-46a9-bd07-d6d86e183408, info: Timer created at 2024-09-25T15:15:35.791765, class: class org.jboss.as.ejb3.timerservice.distributable.OOBTimer
[Server:ha-node-2] 15:25:10,967 INFO [org.jboss.playground.ejb.TestTimer] (ServerService Thread Pool -- 22) INIT: going to cancel
[Server:ha-node-1] 15:25:25,891 ERROR [org.infinispan.interceptors.impl.InvocationContextInterceptor] (non-blocking-thread-primary:ha-node-1-p7-t13) ISPN000136: Error executing command GetKeyValueCommand on Cache 'playground.war.TestTimer.PERSISTENT', writing keys []: org.infinispan.util.concurrent.TimeoutException: ISPN000299: Unable to acquire lock after 15 seconds for key InfinispanTimerMetaDataKey(08079afb-210a-46a9-bd07-d6d86e183408) and requestor GlobalTransaction{id=9, addr=primary:ha-node-1, remote=false, xid=null, internalId=-1}. Lock is held by GlobalTransaction{id=8, addr=primary:ha-node-1, remote=false, xid=null, internalId=-1}
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.DefaultLockManager$KeyAwareExtendedLockPromise.get(DefaultLockManager.java:299)
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.DefaultLockManager$KeyAwareExtendedLockPromise.get(DefaultLockManager.java:229)
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.InfinispanLock$LockPlaceHolder.checkState(InfinispanLock.java:440)
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.InfinispanLock$LockPlaceHolder.lambda$toInvocationStage$3(InfinispanLock.java:416)
[Server:ha-node-1]     at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
[Server:ha-node-1]     at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
[Server:ha-node-1]     at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[Server:ha-node-1]     at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[Server:ha-node-1]     at org.wildfly.clustering.context@1.1.2.Final//org.wildfly.clustering.context.ContextualExecutor$1.execute(ContextualExecutor.java:180)
[Server:ha-node-1]     at org.wildfly.clustering.context@1.1.2.Final//org.wildfly.clustering.context.ContextualExecutor.execute(ContextualExecutor.java:31)
[Server:ha-node-1]     at org.wildfly.clustering.context@1.1.2.Final//org.wildfly.clustering.context.Contextualizer$2$1.run(Contextualizer.java:220)
[Server:ha-node-1]     at java.base/java.lang.Thread.run(Thread.java:829)
[Server:ha-node-1]
[Server:ha-node-1] 15:25:26,022 ERROR [org.infinispan.interceptors.impl.InvocationContextInterceptor] (non-blocking-thread-primary:ha-node-1-p7-t4) ISPN000136: Error executing command LockControlCommand on Cache 'playground.war.TestTimer.PERSISTENT', writing keys []: org.infinispan.util.concurrent.TimeoutException: ISPN000299: Unable to acquire lock after 15 seconds for key InfinispanTimerMetaDataKey(08079afb-210a-46a9-bd07-d6d86e183408) and requestor GlobalTransaction{id=24, addr=primary:ha-node-2, remote=true, xid=null, internalId=-1}. Lock is held by GlobalTransaction{id=8, addr=primary:ha-node-1, remote=false, xid=null, internalId=-1}
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.DefaultLockManager$KeyAwareExtendedLockPromise.get(DefaultLockManager.java:299)
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.DefaultLockManager$KeyAwareExtendedLockPromise.get(DefaultLockManager.java:229)
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.InfinispanLock$LockPlaceHolder.checkState(InfinispanLock.java:440)
[Server:ha-node-1]     at org.infinispan.core@14.0.31.Final//org.infinispan.util.concurrent.locks.impl.InfinispanLock$LockPlaceHolder.lambda$toInvocationStage$3(InfinispanLock.java:416)
[Server:ha-node-1]     at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
[Server:ha-node-1]     at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
[Server:ha-node-1]     at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[Server:ha-node-1]     at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[Server:ha-node-1]     at org.wildfly.clustering.context@1.1.2.Final//org.wildfly.clustering.context.ContextualExecutor$1.execute(ContextualExecutor.java:180)
[Server:ha-node-1]     at org.wildfly.clustering.context@1.1.2.Final//org.wildfly.clustering.context.ContextualExecutor.execute(ContextualExecutor.java:31)
[Server:ha-node-1]     at org.wildfly.clustering.context@1.1.2.Final//org.wildfly.clustering.context.Contextualizer$2$1.run(Contextualizer.java:220)
[Server:ha-node-1]     at java.base/java.lang.Thread.run(Thread.java:829)
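For illustration, a minimal sketch of the kind of bean described above: a @Singleton, @Startup bean that cancels existing persistent timers and re-creates them from @PostConstruct. The TestTimer class name and the "INIT: going to cancel" message come from the log; the timer schedule, field names, and method bodies are assumptions, not the actual reproducer code.

// Hypothetical reproducer sketch; schedule, messages, and structure are assumed.
package org.jboss.playground.ejb;

import jakarta.annotation.PostConstruct;
import jakarta.annotation.Resource;
import jakarta.ejb.Singleton;
import jakarta.ejb.Startup;
import jakarta.ejb.Timeout;
import jakarta.ejb.Timer;
import jakarta.ejb.TimerConfig;
import jakarta.ejb.TimerService;
import java.time.LocalDateTime;
import java.util.logging.Logger;

@Singleton
@Startup
public class TestTimer {

    private static final Logger LOG = Logger.getLogger(TestTimer.class.getName());

    @Resource
    private TimerService timerService;

    @PostConstruct
    public void init() {
        LOG.info("INIT: going to cancel");
        // Cancel any persistent timers left over from a previous run.
        for (Timer timer : timerService.getTimers()) {
            LOG.info("Timer: info: " + timer.getInfo() + ", class: " + timer.getClass());
            timer.cancel();
        }
        // Re-create a persistent timer (TimerConfig with persistent=true).
        TimerConfig config = new TimerConfig("Timer created at " + LocalDateTime.now(), true);
        timerService.createIntervalTimer(60_000L, 60_000L, config);
    }

    @Timeout
    public void onTimeout(Timer timer) {
        LOG.info("Timeout for timer: " + timer.getInfo());
    }
}

With both nodes running this cancel/create sequence during startup against the distributed persistent timer store, the log above shows two GlobalTransactions (one local to ha-node-1, one remote from ha-node-2) contending for the same InfinispanTimerMetaDataKey, which appears to be where the ISPN000299 lock timeout surfaces.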