- Bug
- Resolution: Done
- Major
- None
- None
- False
- False
Description of the problem:
After deploying the orchestrator with Helm chart 0.1.7, deployed workflows don't show up in the orchestrator UI. Checking the greeting workflow's logs shows that the workflow can't reach the data index service:
2024-01-21 17:44:47,257 ERROR [org.kie.kog.eve.pro.ReactiveMessagingEventPublisher] (main) Error while creating event to topic kogito-processdefinitions-events for event ProcessDefinitionDataEvent {specVersion=1.0, id='0a51ef62-7e2a-4ace-a8d3-baab0d4e04e3', source=http://greeting.sonataflow-infra/greeting, type='ProcessDefinitionEvent', time=2024-01-21T17:44:47.053927699Z, subject='null', dataContentType='application/json', dataSchema=null, data=org.kie.kogito.event.process.ProcessDefinitionEventBody@45d4421d, kogitoProcessInstanceId='null', kogitoRootProcessInstanceId='null', kogitoProcessId='greeting', kogitoRootProcessId='null', kogitoAddons='null', kogitoIdentity='null', extensionAttributes={kogitoprocid=greeting}}: java.util.concurrent.CompletionException: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: sonataflow-platform-data-index-service.sonataflow-infra/172.30.171.10:80
    at io.smallrye.mutiny.operators.uni.UniBlockingAwait.await(UniBlockingAwait.java:79)
    at io.smallrye.mutiny.groups.UniAwait.atMost(UniAwait.java:65)
    at io.smallrye.mutiny.groups.UniAwait.indefinitely(UniAwait.java:46)
    at io.smallrye.reactive.messaging.providers.extension.MutinyEmitterImpl.sendMessageAndAwait(MutinyEmitterImpl.java:57)
    at org.kie.kogito.events.process.ReactiveMessagingEventPublisher.publishToTopic(ReactiveMessagingEventPublisher.java:124)
    at org.kie.kogito.events.process.ReactiveMessagingEventPublisher.publish(ReactiveMessagingEventPublisher.java:82)
    at org.kie.kogito.events.process.ReactiveMessagingEventPublisher.publish(ReactiveMessagingEventPublisher.java:113)
    at org.kie.kogito.event.impl.BaseEventManager.lambda$publish$0(BaseEventManager.java:58)
    at java.base/java.lang.Iterable.forEach(Iterable.java:75)
    at org.kie.kogito.event.impl.BaseEventManager.publish(BaseEventManager.java:58)
    at org.kie.kogito.services.registry.ProcessDefinitionEventRegistry.register(ProcessDefinitionEventRegistry.java:77)
    at org.kie.kogito.quarkus.registry.ProcessDefinitionRegistration.onStartUp(ProcessDefinitionRegistration.java:51)
    at org.kie.kogito.quarkus.registry.ProcessDefinitionRegistration_Observer_onStartUp_523b213446c7bf997366f22082854aa9606d55c7.notify(Unknown Source)
    at io.quarkus.arc.impl.EventImpl$Notifier.notifyObservers(EventImpl.java:346)
    at io.quarkus.arc.impl.EventImpl$Notifier.notify(EventImpl.java:328)
    at io.quarkus.arc.impl.EventImpl.fire(EventImpl.java:82)
    at io.quarkus.arc.runtime.ArcRecorder.fireLifecycleEvent(ArcRecorder.java:155)
    at io.quarkus.arc.runtime.ArcRecorder.handleLifecycleEvents(ArcRecorder.java:106)
    at io.quarkus.deployment.steps.LifecycleEventsBuildStep$startupEvent1144526294.deploy_0(Unknown Source)
    at io.quarkus.deployment.steps.LifecycleEventsBuildStep$startupEvent1144526294.deploy(Unknown Source)
    at io.quarkus.runner.ApplicationImpl.doStart(Unknown Source)
    at io.quarkus.runtime.Application.start(Application.java:101)
    at io.quarkus.runtime.ApplicationLifecycleManager.run(ApplicationLifecycleManager.java:111)
    at io.quarkus.runtime.Quarkus.run(Quarkus.java:71)
    at io.quarkus.runtime.Quarkus.run(Quarkus.java:44)
    at io.quarkus.runtime.Quarkus.run(Quarkus.java:124)
    at io.quarkus.runner.GeneratedMain.main(Unknown Source)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at io.quarkus.bootstrap.runner.QuarkusEntryPoint.doRun(QuarkusEntryPoint.java:61)
    at io.quarkus.bootstrap.runner.QuarkusEntryPoint.main(QuarkusEntryPoint.java:32)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: sonataflow-platform-data-index-service.sonataflow-infra/172.30.171.10:80
Caused by: java.net.ConnectException: Connection refused
    at java.base/sun.nio.ch.Net.pollConnect(Native Method)
    at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:672)
    at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:946)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:840)
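For reference, a minimal connectivity probe that can be run from any pod in the same cluster (illustrative only; the class name is mine, host and port are taken from the stack trace above). It separates a DNS problem from a refused connection; in the log above the name already resolves to 172.30.171.10, so the failure is a refused TCP connection on port 80.

import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Socket;

// Illustrative diagnostic, not part of the workflow: resolves the data index
// Service name and attempts a TCP connection on port 80 (values taken from the
// stack trace above) to distinguish a DNS failure from "Connection refused".
public class DataIndexProbe {
    public static void main(String[] args) {
        String host = "sonataflow-platform-data-index-service.sonataflow-infra";
        int port = 80;
        try {
            System.out.println("Resolved to " + InetAddress.getByName(host).getHostAddress());
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println("TCP connect to port " + port + " succeeded");
            }
        } catch (Exception e) {
            System.out.println("Probe failed: " + e);
        }
    }
}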
How reproducible: 100%, though it may be specific to my environment.
Steps to reproduce:
1. Deploy the orchestrator on an OCP cluster with Helm chart 0.1.7
2. Check whether the workflow appears in the orchestrator UI (a direct data index query is sketched after these steps)
3. Check the workflow pod's logs
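To verify step 2 without going through the UI, the data index can be queried directly from inside the cluster. The sketch below assumes the data index exposes its GraphQL endpoint at /graphql and a ProcessDefinitions query (the query and field names are assumptions about the data index version the chart deploys), so treat it as illustrative only.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative check for step 2, run from inside the cluster: asks the data
// index directly whether any workflow definitions were registered. The /graphql
// path and the ProcessDefinitions query/field names are assumptions.
public class DataIndexDefinitionsCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://sonataflow-platform-data-index-service.sonataflow-infra/graphql"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"query\":\"{ ProcessDefinitions { id version } }\"}"))
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        // An empty ProcessDefinitions list would match the missing UI entries,
        // since the definition events in the log above never reach the data index.
        System.out.println(response.body());
    }
}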
Actual results: All pods are running normally, but the workflow can't reach the data index service and does not appear in the orchestrator UI.
Expected results: The workflow should deploy normally and appear in the orchestrator UI.
- depends on: KOGITO-9888 Add support for publishing workflow definition in the Operator (Closed)
- links to