score:2
I think you need to extend the CustomSerializer class, since CustomKeySerializer is used to implement custom logic for JSON keys, not values:
    import org.json4s.{CustomSerializer, DefaultFormats, MappingException}
    import org.json4s.JsonAST._
    import org.json4s.JsonDSL._
    import org.json4s.jackson.JsonMethods._

    case class SimpleFeature(column: String,
                             valueType: String,
                             nullValue: String,
                             func: Option[String])

    case class TaskConfig(simpleFeatures: Option[Seq[SimpleFeature]])

    object Main extends App {

      implicit val formats = new DefaultFormats {
        override val strictOptionParsing: Boolean = true
      } + new SimpleFeatureSerializer()

      // The deserializer validates that all mandatory keys are present before extracting.
      class SimpleFeatureSerializer extends CustomSerializer[SimpleFeature](_ => ( {
        case jsonObj: JObject =>
          val requiredKeys = Set("column", "valueType", "nullValue")
          val diff = requiredKeys.diff(jsonObj.values.keySet)

          if (diff.nonEmpty)
            throw new MappingException(
              s"Fields [${requiredKeys.mkString(",")}] are mandatory. Missing fields: [${diff.mkString(",")}]")

          val column = (jsonObj \ "column").extract[String]
          val valueType = (jsonObj \ "valueType").extract[String]
          val nullValue = (jsonObj \ "nullValue").extract[String]
          val func = (jsonObj \ "func").extract[Option[String]]

          SimpleFeature(column, valueType, nullValue, func)
      }, {
        case sf: SimpleFeature =>
          ("column" -> sf.column) ~
            ("valueType" -> sf.valueType) ~
            ("nullValue" -> sf.nullValue) ~
            ("func" -> sf.func)
      }))

      // case 1: valid single feature
      val singleFeature = """
        {
          "column": "pcat_id",
          "valueType": "categorical",
          "nullValue": "DUMMY"
        }
      """
      val singleFeatureValid = parse(singleFeature).extract[SimpleFeature]
      println(singleFeatureValid)
      // SimpleFeature(pcat_id,categorical,DUMMY,None)

      // case 2: valid task config
      val taskConfig = """{
        "simpleFeatures": [
          {
            "column": "pcat_id",
            "valueType": "categorical",
            "nullValue": "DUMMY"
          },
          {
            "column": "brand_code",
            "valueType": "categorical",
            "nullValue": "DUMMY"
          }]
      }"""
      val taskConfigValid = parse(taskConfig).extract[TaskConfig]
      println(taskConfigValid)
      // TaskConfig(Some(List(SimpleFeature(pcat_id,categorical,DUMMY,None), SimpleFeature(brand_code,categorical,DUMMY,None))))

      // case 3: invalid JSON (mandatory key "valueType" misspelled as "value")
      val invalidSingleFeature = """
        {
          "column": "pcat_id",
          "value": "categorical",
          "nullValue": "DUMMY"
        }
      """
      val singleFeatureInvalid = parse(invalidSingleFeature).extract[SimpleFeature]
      // throws MappingException: Missing fields: [valueType]
    }
Analysis: the main question here is how to gain access to the keys of jsonObj in order to check whether a key is invalid or missing; one way to achieve that is through jsonObj.values.keySet. For the implementation, we first assign the mandatory fields to the requiredKeys variable, then compare requiredKeys with the keys that are actually present using requiredKeys.diff(jsonObj.values.keySet). If the difference is non-empty, a mandatory field is missing, and we throw an exception that names the missing fields.
Note 1: don't forget to add the new serializer to the available formats.
Note 2: we throw an instance of MappingException, which json4s itself already uses internally when parsing a JSON string.
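The key-set comparison at the heart of the validation can be exercised on its own with plain Scala collections. The key names below match the serializer; presentKeys is a made-up example standing in for jsonObj.values.keySet:

```scala
// Standalone sketch of the validation step: compare the mandatory keys
// against the keys actually present in the parsed object.
val requiredKeys = Set("column", "valueType", "nullValue")

// Example key set as it would come from jsonObj.values.keySet
// for a JSON object that misspells "valueType" as "value".
val presentKeys = Set("column", "value", "nullValue")

val missing = requiredKeys.diff(presentKeys)
if (missing.nonEmpty)
  println(s"Missing fields: [${missing.mkString(",")}]") // prints: Missing fields: [valueType]
```

Note that Set.diff is one-directional: it reports required keys that are absent, but extra unknown keys (like "value" here) are simply ignored.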
UPDATE
In order to force validation of Option fields, you need to set the strictOptionParsing option to true by overriding the corresponding member:

    implicit val formats = new DefaultFormats {
      override val strictOptionParsing: Boolean = true
    } + new SimpleFeatureSerializer()
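As a rough analogy in plain Scala (no json4s), strict option parsing changes what happens when a value for an Option field is present but cannot be converted: lenient mode silently degrades it to None, while strict mode raises an error; a genuinely missing key yields None either way. The extractOpt helper below is hypothetical, not part of the json4s API:

```scala
// Hypothetical model of lenient vs strict Option extraction.
def extractOpt(raw: Option[Any], strict: Boolean): Option[Int] = raw match {
  case None                => None    // missing key: None in both modes
  case Some(i: Int)        => Some(i) // convertible value
  case Some(bad) if strict => throw new IllegalArgumentException(s"Cannot convert $bad")
  case Some(_)             => None    // lenient mode swallows the error
}

extractOpt(None, strict = true)          // None
extractOpt(Some(42), strict = true)      // Some(42)
extractOpt(Some("oops"), strict = false) // None
// extractOpt(Some("oops"), strict = true) would throw
```

This is why strictOptionParsing matters for TaskConfig above: without it, a MappingException raised inside the custom serializer for an element of the Option-wrapped simpleFeatures field could be swallowed and turned into None instead of surfacing to the caller.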
Resources
https://nmatpt.com/blog/2017/01/29/json4s-custom-serializer/
https://danielasfregola.com/2015/08/17/spray-how-to-deserialize-entities-with-json4s/
https://www.programcreek.com/scala/org.json4s.CustomSerializer
score:0
You can try using play.api.libs.json, with these sbt dependencies:

    "com.typesafe.play" %% "play-json" % "2.7.2",
    "net.liftweb" % "lift-json_2.11" % "2.6.2"
You just need to define a case class and formatters.
Example:

    case class Example(a: String, b: String)

    implicit val formats: DefaultFormats.type = DefaultFormats
    implicit val instancesFormat = Json.format[Example]

and then just do:

    Json.parse(jsonData).asOpt[Example]
If the above gives errors, try also adding "net.liftweb" % "lift-json_2.11" % "2.6.2" to your dependencies.