score:15
Accepted answer
It's an initialization order problem: you need to define Reads[Hits] before Reads[SearchLikeThisResult]. The code compiles because the symbol exists, but when Reads[SearchLikeThisResult] is initialized, Reads[Hits] has not been initialized yet, so the captured reference is null. This goes unnoticed until the parser attempts to read the array of Hits, and hits the NPE.
So just swap the order. This is related to this answer.
import play.api.libs.json._
import play.api.libs.functional.syntax._

// Define Reads[Hits] first, so it is initialized before it is captured below.
implicit val hitsReads: Reads[Hits] = (
  (JsPath \ "_index").read[String] and
  (JsPath \ "_type").read[String] and
  (JsPath \ "_id").read[String] and
  (JsPath \ "_score").read[Double]
)(Hits.apply _)

implicit val searchLikeThisResult: Reads[SearchLikeThisResult] = (
  (JsPath \ "total").read[Int] and
  (JsPath \ "max_score").read[Double] and
  (JsPath \ "hits").read[Seq[Hits]]
)(SearchLikeThisResult.apply _)
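The underlying mechanism can be shown without Play at all: in Scala, vals in the same scope are initialized in declaration order, and a forward reference to a later val compiles but observes null. A minimal sketch (names are illustrative, not from the question):

```scala
// Minimal sketch of the initialization-order trap: `a` refers to `b`
// before `b`'s initializer has run, so it captures null rather than
// failing to compile.
object InitOrder {
  val a: String = s"b is: $b" // at this point `b` is still null
  val b: String = "defined"
}
```

Here `InitOrder.a` silently bakes in the null, just as the out-of-order Reads[SearchLikeThisResult] bakes in a null Reads[Hits]; the failure only surfaces later, when the captured value is actually used.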
Source: stackoverflow.com