java - NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less

I have seen many answers related to this error, but they all point to Scala version mismatches. I think my case is different.

I have a remote Spark master-worker cluster set up with version 2.10. I was able to verify it through http://master-ip:8080, which lists all the worker nodes.

From my application, I am trying to create a SparkConf using Java 7. Here is the code:

sparkConf = new SparkConf(true)
                .set("spark.cassandra.connection.host", "localhost")
                .set("spark.cassandra.auth.username", "username")
                .set("spark.cassandra.auth.password", "pwd")
                .set("spark.master", "spark://master-ip:7077")
                .set("spark.app.name","Test App");

These are the Maven dependencies I added:

<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <exclusions>
        <exclusion>
            <groupId>javax.validation</groupId>
            <artifactId>validation-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>

I get the error below:

Caused by: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1710)
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)

Spark version from one of the worker nodes:

./spark-shell --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_91
Branch 
Compiled by user jenkins on 2016-12-16T02:04:48Z
Revision 
Url 
Type --help for more information.

1 Reply


It is a Scala version mismatch.

Your cluster has Scala 2.10, but your Spark dependency is

spark-core_2.11

which is built for Scala 2.11. Mixing Scala binary versions in one application is exactly what produces a NoSuchMethodError like this, because the Scala 2.10 library pulled in by spark-cassandra-connector_2.10 does not have the method that spark-core_2.11 expects.

Change it to spark-core_2.10, so it matches your cluster and the connector, and it will work.
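As a minimal sketch, assuming your cluster really is on Scala 2.10 and that the _2.10 artifacts are published for these versions, the dependencies section would look like this, with both artifacts sharing the same Scala suffix:

<!-- Both Spark-related artifacts must use the same Scala binary version suffix. -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <!-- Changed from spark-core_2.11 to match the connector and the cluster -->
    <artifactId>spark-core_2.10</artifactId>
    <version>2.1.0</version>
    <exclusions>
        <exclusion>
            <groupId>javax.validation</groupId>
            <artifactId>validation-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>

You can confirm which scala-library version actually ends up on your classpath with mvn dependency:tree -Dincludes=org.scala-lang:scala-library.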

