Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
java - Spark 2.0.0 Arrays.asList not working - incompatible types

The following code works with Spark 1.5.2 but not with Spark 2.0.0. I am using Java 1.8.

final SparkConf sparkConf = new SparkConf();
sparkConf.setMaster("local[4]"); // Four threads
final JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
final JavaRDD<String> javaRDDLines = javaSparkContext.textFile("4300.txt");
final JavaRDD<String> javaRDDWords = javaRDDLines.flatMap(line -> Arrays.asList(line.split(" ")));

I get the following error:

Error:(46, 66) java: incompatible types: no instance(s) of type variable(s) T exist so that java.util.List<T> conforms to java.util.Iterator<U>

I cannot tell whether the Spark API has changed or something else is wrong. Please help. Thanks.

1 Reply


In Spark 2.0, FlatMapFunction.call() returns an Iterator<U> rather than an Iterable<U>, so a flatMap lambda must now produce an iterator. Try this:

JavaRDD<String> javaRDDWords = javaRDDLines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());
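Outside of Spark, the change boils down to returning an Iterator instead of a List/Iterable. A minimal plain-Java sketch of the new contract (tokenize is a hypothetical helper mirroring what the Spark 2.x FlatMapFunction now expects; no Spark dependency needed to see the pattern):

```java
import java.util.Arrays;
import java.util.Iterator;

public class FlatMapIteratorDemo {

    // Hypothetical helper mirroring the Spark 2.x FlatMapFunction contract:
    // call() must return an Iterator<U>, not an Iterable<U> as in Spark 1.x.
    // Arrays.asList(...) gives a List (an Iterable), and .iterator() adapts
    // it to the Iterator the new flatMap signature requires.
    static Iterator<String> tokenize(String line) {
        return Arrays.asList(line.split(" ")).iterator();
    }

    public static void main(String[] args) {
        Iterator<String> it = tokenize("to be or not");
        while (it.hasNext()) {
            System.out.println(it.next());
        }
    }
}
```

Passing `Arrays.asList(words)` directly (the Spark 1.x style) fails to compile against the 2.x interface, which is exactly the "List<T> conforms to Iterator<U>" error in the question.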
