import org.apache.spark._

val conf = new SparkConf()
  .setMaster("local")
  .setAppName("Twitter Analyzer")
  .setSparkHome("/home/welcome/Downloads/spark-1.1.0/")
  .setJars(Seq("target/scala-2.10/Simple-assembly-0.1.0.jar"))
val sc = new SparkContext(conf)
The compiler error ("missing parameter type for expanded function: the argument types of an anonymous function must be fully known", SLS 8.5) comes from Scala's type inference, which needs type context to infer the type of `line` in:

val numAs = twitterFeed.filter(line => line.contains(value))

The element is clearly of String type, but through the Java version of SparkContext (JavaSparkContext) you simply lose that type information. Provided you use SparkContext instead, the line above can be simplified further to:

val numAs = twitterFeed.filter(_.contains(value))

or even:

twitterFeed.filter(_ contains value)

All those goodies come from using SparkContext. Alternatively, keeping JavaSparkContext and annotating the lambda parameter explicitly:

val numAs = twitterFeed.filter((i: String) => i.contains(value))

also resolved the problem.
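The same inference rule can be seen in plain Scala, without Spark. This is a minimal sketch (names like `feed` and `value` are illustrative, not from the original code): when the expected function type is known from the receiver, the parameter type is inferred; when it is not, an explicit annotation is the fix.

```scala
// Plain-Scala sketch of the inference behavior described above.
object InferenceDemo {
  val feed: List[String] = List("spark streaming", "scala inference")
  val value: String = "scala"

  // The expected type String => Boolean comes from List[String].filter,
  // so `line` needs no annotation:
  val inferred: List[String] = feed.filter(line => line.contains(value))

  // When the element type is not visible to the compiler (analogous to
  // what happens with JavaSparkContext's JavaRDD seen from Scala),
  // annotating the parameter supplies the missing type context:
  val annotated: List[String] = feed.filter((i: String) => i.contains(value))
}
```

Both calls select the single element containing "scala"; the only difference is where the parameter type comes from.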