score:25
There are several ways to do this with the Play JSON library. The main difference is whether or not you use a Scala case class.
Given a simple JSON document:
val json = Json.parse("""{"people": [ {"name":"Jack", "age": 19}, {"name": "Tony", "age": 26} ] }""")
You can use a case class and the Json.reads macro to parse the data automatically:
import play.api.libs.json._
case class People(name: String, age: Int)
implicit val peopleReader: Reads[People] = Json.reads[People]
val peoples = (json \ "people").as[List[People]]
peoples.foreach(println)
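which prints
People(Jack,19)
People(Tony,26)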
Or, without a case class, manually:
import play.api.libs.json._
import play.api.libs.functional.syntax._
implicit val personReader: Reads[(String, Int)] = (
  (__ \ "name").read[String] and
  (__ \ "age").read[Int]
).tupled
val peoples = (json \ "people").as[List[(String, Int)]]
peoples.foreach(println)
In any case, check the very complete documentation on this subject :) http://www.playframework.com/documentation/2.1.0/ScalaJson
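One thing to keep in mind: .as throws a JsResultException if the JSON doesn't match the expected shape. A minimal sketch (not part of the original code, reusing the same peopleReader defined above) that uses validate to handle failures instead:

(json \ "people").validate[List[People]] match {
  case JsSuccess(people, _) => people.foreach(println)          // parsed successfully
  case JsError(errors)      => println(s"Could not parse people: $errors")  // report validation errors
}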
score:4
If you don't have a concrete type to map to, or don't want to write a Reads, you can use .as[Array[JsValue]]:
val jsValue = Json.parse(text)
val list = (jsValue \ "people").as[Array[JsValue]]
Then
list.foreach(a => println((a \ "name").as[String]))
In older versions (2.6.x) it was possible to use .as[List[JsValue]], but newer versions only support Array.
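Building on the same idea, here is a minimal sketch (assuming the same jsValue as above; the pairs value is just illustrative) that extracts both fields without defining any Reads:

val pairs = (jsValue \ "people").as[Array[JsValue]].map { p =>
  // pull each field out of the raw JsValue individually
  ((p \ "name").as[String], (p \ "age").as[Int])
}
pairs.foreach { case (name, age) => println(s"$name is $age") }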