Hybrid Cloud Console / RHCLOUD-24067

String-empty or malformed CE extras breaks integration

      Description of problem

      If the extras field in a CloudEvent's data is an empty string or malformed JSON, it breaks the integration. The failure occurs in CloudEventDecoder.

      2022-03-02 15:40:15,329 WARN  [org.apa.cam.com.kaf.KafkaConsumer] (Camel (redhat-splunk-quarkus) thread #0 - KafkaConsumer[platform.notifications.tocamel]) Error during processing. Exchange[B362560E632E10C-0000000000000002]. Caused by: [org.apache.camel.util.json.DeserializationException - The unexpected token END() was found at position 0. Fix the parsable string and try again.]: org.apache.camel.util.json.DeserializationException: The unexpected token END() was found at position 0. Fix the parsable string and try again.
      	at org.apache.camel.util.json.Jsoner.deserialize(Jsoner.java:192)
      	at org.apache.camel.util.json.Jsoner.deserialize(Jsoner.java:112)
      	at org.apache.camel.util.json.Jsoner.deserialize(Jsoner.java:362)
      	at com.redhat.console.notifications.splunkintegration.CloudEventDecoder.process(CloudEventDecoder.java:34)
      	at org.apache.camel.support.processor.DelegateSyncProcessor.process(DelegateSyncProcessor.java:66)
      	at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$RedeliveryTask.doRun(RedeliveryErrorHandler.java:804)
      	at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$RedeliveryTask.run(RedeliveryErrorHandler.java:712)
      	at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:179)
      	at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:64)
      	at org.apache.camel.processor.Pipeline.process(Pipeline.java:184)
      	at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:398)
      	at org.apache.camel.impl.engine.DefaultAsyncProcessorAwaitManager.process(DefaultAsyncProcessorAwaitManager.java:83)
      	at org.apache.camel.support.AsyncProcessorSupport.process(AsyncProcessorSupport.java:41)
      	at org.apache.camel.component.kafka.KafkaConsumer$KafkaFetchRecords.doPollRun(KafkaConsumer.java:403)
      	at org.apache.camel.component.kafka.KafkaConsumer$KafkaFetchRecords.doRun(KafkaConsumer.java:286)
      	at org.apache.camel.component.kafka.KafkaConsumer$KafkaFetchRecords.run(KafkaConsumer.java:249)
      	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
      	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
      	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
      	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
      	at java.base/java.lang.Thread.run(Thread.java:829)
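
      The decoder could guard against this by treating blank or unparsable extras as an empty map instead of letting the exception kill the exchange. A minimal sketch (hypothetical helper, not the actual CloudEventDecoder code; the parser is injected as a functional interface so the sketch has no Camel dependency, whereas the real decoder would delegate to Jsoner.deserialize):

      ```java
      import java.util.Collections;
      import java.util.Map;

      public class ExtrasParser {

          // Stand-in for the JSON parser; in the real decoder this would be
          // something like: s -> (Map<String, Object>) Jsoner.deserialize(s)
          @FunctionalInterface
          interface JsonParser {
              Map<String, Object> parse(String s) throws Exception;
          }

          static Map<String, Object> safeExtras(String extras, JsonParser parser) {
              // Blank extras ("" or whitespace): nothing to merge, return empty.
              if (extras == null || extras.trim().isEmpty()) {
                  return Collections.emptyMap();
              }
              try {
                  return parser.parse(extras);
              } catch (Exception e) {
                  // Malformed JSON: fall back to empty instead of failing the
                  // exchange, so the event can still be forwarded to Splunk.
                  return Collections.emptyMap();
              }
          }
      }
      ```

      With a guard like this, a blank or malformed extras value degrades to an empty map and the route continues, rather than the exchange dying in the redelivery error handler as in the stack trace above.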
      

      How reproducible

      Always

      Steps to Reproduce:

      1. In CRC, set up an Integration whose extras value is an empty string or just a space.
      2. Send an event.
      3. Observe the pod logs.
      4. Repeat with a malformed JSON value for extras, for example an unclosed JSON object.

      Actual results

      No event lands in Splunk. The integration breaks with the stack trace above.

      Expected results

      Events still land in Splunk even when the extras value in the Integration settings is malformed.

      Additional info

            Assignee: Unassigned
            Reporter: Viliam Krizan (vkrizan1@redhat.com)