Red Hat Data Grid / JDG-29

JDG-Spark connector should be built against Scala 2.10 by default


      Spark is built against Scala 2.10, while the JDG-Spark connector is built against Scala 2.11. This results in the following exception when running in Spark (a minimal driver sketch that reproduces the call path is shown after the stack trace):

      Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
              at org.infinispan.spark.rdd.PerServerSplitter.split(Splitter.scala:43)
              at org.infinispan.spark.rdd.InfinispanRDD.getPartitions(InfinispanRDD.scala:81)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
              at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
              at org.apache.spark.SparkContext.runJob(SparkContext.scala:1921)
              at org.apache.spark.rdd.RDD.count(RDD.scala:1125)
              at org.infinispan.demo.TextSearch$.main(TextSearch.scala:21)
              at org.infinispan.demo.TextSearch.main(TextSearch.scala)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:497)
              at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
              at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
              at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
              at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
              at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      
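      The failure comes from connector bytecode compiled against Scala 2.11 calling scala.Predef.$conforms, a method that does not exist in the Scala 2.10 runtime shipped with Spark. The driver below is only a minimal sketch of the kind of job that hits it: the object name and cache setup are illustrative (not the actual TextSearch demo source), and it assumes the Properties-based InfinispanRDD constructor of the early connector releases and a Hot Rod server on localhost:11222.

      import java.util.Properties

      import org.apache.spark.{SparkConf, SparkContext}
      import org.infinispan.spark.rdd.InfinispanRDD

      object TextSearch {
        def main(args: Array[String]): Unit = {
          val sc = new SparkContext(new SparkConf().setAppName("text-search"))

          // Hot Rod client configuration; server address is an assumption for the sketch.
          val props = new Properties()
          props.put("infinispan.client.hotrod.server_list", "localhost:11222")

          val rdd = new InfinispanRDD[Int, String](sc, props)

          // count() forces partition computation: InfinispanRDD.getPartitions runs the
          // PerServerSplitter, whose 2.11-compiled code references scala.Predef.$conforms
          // and therefore throws NoSuchMethodError on a 2.10 Spark runtime.
          println(rdd.count())

          sc.stop()
        }
      }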

      If we provide only one version of the connector, I believe it should be the one built for Scala 2.10, so that it matches the Scala version of the Spark bits.
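
      To illustrate the consumer side (the artifact coordinates and versions below are assumptions for the sketch, not the published GAVs), a Scala 2.10 build would let an sbt project resolve the connector through the standard %% cross-version suffix, which must match the Scala version Spark itself was built with:

      // build.sbt -- minimal sketch
      scalaVersion := "2.10.6"

      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
        // With scalaVersion 2.10.x, %% resolves to an _2.10 artifact; publishing only
        // a _2.11 connector forces users onto a binary-incompatible build.
        "org.infinispan" %% "infinispan-spark" % "0.1"
      )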

              vdedik Vaclav Dedik (Inactive)
              vjuranek@redhat.com Vojtech Juranek
              Votes: 0
              Watchers: 2
