score:5

Accepted answer

As @groverboy advised in the comments, you should really be using org.apache.spark.SparkContext instead. The Spark Programming Guide's Initializing Spark section is also clear on this.

import org.apache.spark._

val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("Twitter Analyzer")
  .setSparkHome("/home/welcome/Downloads/spark-1.1.0/")
  .setJars(Seq("target/scala-2.10/Simple-assembly-0.1.0.jar"))
val sc = new SparkContext(conf)

The reason is Scala's type inference, which needs type context to work out the type of the line parameter in:

val numAs = twitterFeed.filter(line => line.contains(value))

It's clearly of type String, but with the Java version of SparkContext, JavaSparkContext, you simply lose that type information.
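
For illustration, a minimal sketch of where that type context comes from; the file path and search value are assumptions for the example, not taken from the question:

// sc is the Scala SparkContext built above; textFile returns RDD[String],
// so the compiler can infer that line is a String.
val twitterFeed = sc.textFile("/path/to/tweets.txt")
val value = "spark"
val numAs = twitterFeed.filter(line => line.contains(value)).count()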

Provided you use SparkContext, the above line can be further simplified to:

val numAs = twitterFeed.filter(_.contains(value))

or even:

twitterFeed.filter(_ contains value)

All the goodies are just a SparkContext away.
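
Putting it together, a minimal end-to-end sketch; the input path, search value, and println are illustrative assumptions, while the SparkConf settings come from the snippet above:

import org.apache.spark._

object TwitterAnalyzer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("Twitter Analyzer")
    val sc = new SparkContext(conf)

    // RDD[String], so the element type is known downstream
    val twitterFeed = sc.textFile("/path/to/tweets.txt")
    val value = "spark"
    val numAs = twitterFeed.filter(_.contains(value)).count()
    println(s"Lines containing '$value': $numAs")

    sc.stop()
  }
}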

score:4

Explicitly annotating the parameter type:

val numAs = twitterFeed.filter((i: String) => i.contains(value))

resolved the problem.

