score:1
I do not know whether you need a generic solution, but in your particular case you can write something like this:
import org.apache.spark.sql.functions.{col, explode}
import spark.implicits._  // provides the Encoder needed by .as[String]

spark.read.json(newDF.as[String])            // re-parse each JSON string, letting Spark infer the schema
  .withColumn("RPM", explode(col("RPM")))    // one row per element of the RPM array
  .withColumn("E", col("RPM.E"))             // promote the struct fields to top-level columns
  .withColumn("V", col("RPM.V"))
  .drop("RPM")
  .show()
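
For context, here is a minimal, self-contained sketch of the same approach. The sample JSON document, the local SparkSession setup, and the jsonDS name are assumptions, since the original question's data is not shown; jsonDS simply stands in for newDF.as[String]:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, explode}

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("flatten-rpm")
  .getOrCreate()
import spark.implicits._

// Hypothetical stand-in for newDF.as[String]: one JSON document per row,
// each containing an "RPM" array of {E, V} structs.
val jsonDS = Seq(
  """{"RPM": [{"E": 1, "V": 100}, {"E": 2, "V": 200}]}"""
).toDS()

spark.read.json(jsonDS)                      // schema is inferred from the JSON strings
  .withColumn("RPM", explode(col("RPM")))    // one row per array element
  .withColumn("E", col("RPM.E"))
  .withColumn("V", col("RPM.V"))
  .drop("RPM")
  .show()
// Shows two rows: (E=1, V=100) and (E=2, V=200)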
Source: stackoverflow.com