The error message says that the Encoder is not able to take the Person case class.
Error:(15, 67) Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._ Support for serializing other types will be added in future releases.
Move the declaration of the Person case class outside the scope of SimpleApp, i.e. define it outside object SimpleApp. Also, add import sqlContext.implicits._ inside the main() function.
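Why the case class has to live at the top level can be illustrated without Spark at all: Spark's implicit encoder derivation requires a full scala-reflect TypeTag, and the compiler can only materialize one for classes that are not defined inside a method. This is a minimal sketch (ScopeDemo and its members are made-up names for the illustration):

```scala
import scala.reflect.runtime.universe._

// Top-level case class: the compiler can materialize a full TypeTag,
// which is what Spark's implicit Product encoders require.
case class Person(age: Long, city: String)

object ScopeDemo {
  // Works because Person is declared at the top level.
  def topLevelTag: String = typeOf[Person].toString

  def localTag: String = {
    case class Inner(x: Int) // declared inside a method
    // typeTag[Inner] would NOT compile here ("No TypeTag available for Inner");
    // only the weaker WeakTypeTag exists for method-local classes, and that
    // is not enough for Spark's encoder machinery.
    weakTypeTag[Inner].tpe.toString
  }

  def main(args: Array[String]): Unit = {
    println(topLevelTag)
    println(localTag)
  }
}
```

The same reasoning applies to SimpleApp: once Person is moved out of the enclosing method/object scope, the TypeTag (and hence the Encoder) can be derived.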
You get the same error if you import both sqlContext.implicits._ and spark.implicits._ inside SimpleApp (the order doesn't matter).
Removing one or the other will be the solution:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
val sqlContext = spark.sqlContext

import sqlContext.implicits._   // sqlContext OR spark implicits
//import spark.implicits._      // sqlContext OR spark implicits

case class Person(age: Long, city: String)

val persons = spark.read.json("/tmp/persons.json").as[Person]
Tested with Spark 2.1.0.

The funny thing is that if you import the same implicits object twice, you will not have any problem.
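The double-import pitfall can be modeled without Spark. Two different objects each providing an implicit of the same shape make resolution ambiguous, so neither applies and the compiler falls back to "unable to find" the implicit; importing the same object twice is harmless because it is the same instance. A hedged sketch (the Enc trait and object names below are invented for the illustration, standing in for Spark's Encoder):

```scala
case class Person(age: Long, city: String)

// Toy stand-in for Spark's Encoder type class.
trait Enc[T]

object SqlCtxImplicits { implicit val personEnc: Enc[Person] = new Enc[Person] {} }
object SparkImplicits  { implicit val personEnc: Enc[Person] = new Enc[Person] {} }

object ImplicitsDemo {
  // Importing BOTH objects would make the call below ambiguous, and
  // implicit search would fail, just like the two Spark imports do:
  //   import SqlCtxImplicits._
  //   import SparkImplicits._
  import SqlCtxImplicits._ // keep exactly one import

  def summon[T](implicit e: Enc[T]): Enc[T] = e
  def resolved: Boolean = summon[Person] != null

  def main(args: Array[String]): Unit = println(resolved)
}
```

Removing one of the two conflicting imports restores a single unambiguous implicit, which is exactly the fix shown above.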