Score: 3
Accepted answer
One option: define case classes for the row and for the array element, then read the JSON as that type.
case class Nested(D: Long, E: String)
case class TestClass(A: Long, B: String, C: Seq[Nested])
Usage:
import spark.implicits._  // provides the Encoder needed by .as[TestClass]

spark.read.json(sc.parallelize(Seq(
  """{"A": 123, "B": "Hello world", "C": [{"D": 123, "E": "Spark"}]}"""
))).as[TestClass].show
+---+-----------+-------------+
| A| B| C|
+---+-----------+-------------+
|123|Hello world|[[123,Spark]]|
+---+-----------+-------------+
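A related option, shown here as a minimal sketch (not part of the original answer, and assuming the same spark session and the case classes above): derive the schema from the case classes with Encoders.product and pass it to the JSON reader, so the array column's element type is fixed up front rather than inferred from the data.

import org.apache.spark.sql.Encoders
import spark.implicits._

// Build the StructType (including C: array<struct<D: long, E: string>>)
// from the case classes instead of letting the JSON reader infer it.
val schema = Encoders.product[TestClass].schema

spark.read
  .schema(schema)
  .json(Seq("""{"A": 123, "B": "Hello world", "C": [{"D": 123, "E": "Spark"}]}""").toDS())
  .as[TestClass]
  .show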
Source: stackoverflow.com
Related Queries
- How to set the type of array with dataset in spark scala
- How do you properly set up Scala Spark libraryDependencies with the correct version of Scala?
- What is the efficient way to create Spark DataFrame in Scala with array type columns from another DataFrame that does not have an array column?
- Retrieve the struct from array of structs when struct field of struct type matches with specific value in spark scala
- How do I run the Spark decision tree with a categorical feature set using Scala?
- Scala Spark: how to use a Dataset for a case class when the schema has snake_case?
- Scala - Spark - How to transform a DataFrame containing one string column to a DF with columns of the right type?
- How to see the type of a scala variable? For instance a Spark PairRDD
- How to set resource requirements on a container with the fabric8 kubernetes Java & Scala client API
- How to convert a spark DataFrame with a Decimal to a Dataset with a BigDecimal of the same precision?
- Scala - how come using a super-type with two generic parameters cause the scala type checker to treat the child-type differently?
- How to read from a text file (String type data), map, and load data into Parquet format (multiple columns with different datatypes) in Spark Scala dynamically
- Scala Spark How can I convert a column array[string] to a string with JSON array in it?
- How to look up the type of a method returning a type parameter with scala reflection?
- How do I explode multiple columns of arrays in a Spark Scala dataframe when the columns contain arrays that line up with one another?
- How to use withColumn with condition for the each row in Scala / Spark data frame
- How to create your own type with closed set of values, in Scala
- How to do type conversion with the Scala Camel DSL
- How to check if schema.field.dataType is an array of strings in Scala with Spark
- How do I get the type signature to agree with the scala compiler when extracting a value out of an Option[A]?
- How to interpret methods with <A,U> RDD<U> in the Scala Spark docs?
- Compare rows of an array column with the headers of another data frame using Scala and Spark
- How to set the value of a variable based on the generic type - Scala
- How to manually create a Dataset with a Set column in Scala
- Scala classOf for type parameter: how to call the function and how to restrict with upper type bounds
- How to extract week day as a number from a Spark dataframe with the Scala API
- How to convert a simple DataFrame to a DataSet Spark Scala with case class?
- How to change the element type of a stream with map in Scala
- How to know which is the RDD type inferred by Spark using Scala
- How to create schema in Spark with Scala if more than 100 columns in the input?
More Queries from the same tag
- Why is logback loading configurations in a different order and ignoring system properties (SBT)?
- How can I use this Java function in Scala in a UDF?
- Natural/Human Ordering String (Case Class with String)
- Semantic Rules in Scala
- How to deploy scala files used in spark-shell on cluster?
- How to turn `Either[Error, Option[Either[Error, Account]]]` to `Either[Error, Option[Account]]` with typelevel cats?
- Executing a list of functions with early termination
- SBT plugin -- execute custom task before compilation
- Spark map reduce with condition
- monoid vs monad in Scala
- Optional Json Body Parser
- How to add columns into org.apache.spark.sql.Row inside of mapPartitions
- Do Scala specifications 2.10 and 2.11 exist?
- Understanding a sbt error when executing the package command
- How to use the RangePartitioner in Spark
- Scala Anorm. Insert exception; parameter index out of range
- sum previous row value in current row in spark scala
- Relationship between instance of classes and its companion object
- Can I use JSON4S hashCode method as a key?
- Sbt can't find SbtScalariform
- How to get COUNT for same table in different databases with table name in SELECT statement?
- Compilation failing with pattern matching on Number type
- FS2 parallel evaluation of items with different discriminator
- Spark Dataframe transformation
- Unexpected exception: IOException: invalid constant type 117 at 283
- getting the index of partial tuple in a list
- Manually marking flyway migration as completed
- How do I include ScalaDoc options in Maven
- Need help understanding Scala Futures with Scrimage image package
- How to get input from Scala after a certain point in time?