First, what the error tells you is that you shouldn't have a case class for a class which is extended. What it says about extractors is basically that, should the need arise to pattern match on a parent class (which is where the "case" of "case class" comes from: easy pattern matching), you can always write custom extractors for it. However, this does not apply in your case, since you cannot change the fact that your lib exposes case classes.
The other solution is to consider that there is no inheritance between your two classes: indeed, one is the representation of a user, while the other is the representation of a storage unit containing a user. So it looks like your proposed solution has some meaning after all.
Still, it might seem cumbersome to have to add a field both in the serialized object (in Mongo) and in the JVM class, where you will always need to write mongoUser.user to get a lib.MyUser from a MyUser...
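To make the setup concrete, here is a minimal sketch of the composition approach. The shapes are assumptions for illustration: lib.MyUser stands in for the case class exposed by the library, and the String _id stands in for a real BSONObjectID.

```scala
// Hypothetical stand-in for the case class your library exposes.
object lib {
  case class MyUser(id: String, name: String)
}

// Composition instead of inheritance: the Mongo document wraps the lib user.
// `_id` is a String here for illustration (a real model would use BSONObjectID).
case class MyUser(_id: Option[String], user: lib.MyUser)

val stored = MyUser(Some("5f1a2b"), lib.MyUser("42", "alice"))
// The wrapped fields are only reachable through `user`:
stored.user.name // "alice"
```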
One elegant (IMHO) solution is to keep this field user: lib.MyUser, but to have your serialization flatten it to the top level of the BSON object, so that a serialized MyUser will look like

{
  _id: ObjectId(_),
  id: 123456789abc-1234-1234-1234,
  ...
}

as if you had inherited the fields of lib.MyUser.
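The flattening itself can be sketched as follows. This is illustrative only: Map[String, Any] stands in for a BSON document, and the class shapes are assumptions; a real implementation would use your driver's document type and writers.

```scala
// Hypothetical stand-in for the library's case class.
object lib {
  case class MyUser(id: String, name: String)
}
case class MyUser(_id: Option[String], user: lib.MyUser)

// Map[String, Any] stands in for a BSON document in this sketch.
def writeLibUser(u: lib.MyUser): Map[String, Any] =
  Map("id" -> u.id, "name" -> u.name)

// Flatten: the wrapped user's fields are merged into the top-level
// document instead of being nested under a "user" key.
def writeMyUser(m: MyUser): Map[String, Any] =
  m._id.map(id => Map[String, Any]("_id" -> id)).getOrElse(Map.empty) ++
    writeLibUser(m.user)

writeMyUser(MyUser(Some("abc"), lib.MyUser("42", "alice")))
// Map("_id" -> "abc", "id" -> "42", "name" -> "alice")
```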
Deserialization
Now, whenever you want to get a lib.MyUser from Mongo, for read purposes only, you can deserialize it directly in this format, ignoring the added fields. If you need to do some updates on it, however, you will have to deserialize it as a MyUser, so that you can update this particular BSON document.
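The two read paths can be sketched like this, again with Map[String, Any] standing in for a BSON document and hypothetical class shapes: one reader yields a plain lib.MyUser and ignores the extra fields, the other keeps the _id so the document can be updated later.

```scala
// Hypothetical stand-ins for the library case class and the wrapper.
object lib {
  case class MyUser(id: String, name: String)
}
case class MyUser(_id: Option[String], user: lib.MyUser)

// Read-only: build a lib.MyUser directly, ignoring _id and any other
// storage-specific fields in the flattened document.
def readLibUser(doc: Map[String, Any]): lib.MyUser =
  lib.MyUser(doc("id").toString, doc("name").toString)

// For updates: keep the _id around by reading the full wrapper.
def readMyUser(doc: Map[String, Any]): MyUser =
  MyUser(doc.get("_id").map(_.toString), readLibUser(doc))

val doc = Map[String, Any]("_id" -> "abc", "id" -> "42", "name" -> "alice")
readLibUser(doc)    // lib.MyUser("42", "alice")
readMyUser(doc)._id // Some("abc")
```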
Scala usage
When you have deserialized the object as a MyUser (say, for an update), you might still want easy access to all the fields exposed in lib.MyUser, without having to go through the user field every time. This can be done with an implicit conversion.
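A minimal sketch of that conversion, using the same hypothetical class shapes as above: with the implicit in scope, members of lib.MyUser become reachable directly on the wrapper.

```scala
import scala.language.implicitConversions

// Hypothetical stand-ins for the library case class and the wrapper.
object lib {
  case class MyUser(id: String, name: String)
}
case class MyUser(_id: Option[String], user: lib.MyUser)

// Wherever a lib.MyUser is expected, a MyUser is unwrapped automatically,
// and lib.MyUser's members can be selected directly on a MyUser.
implicit def toLibUser(m: MyUser): lib.MyUser = m.user

val m = MyUser(Some("abc"), lib.MyUser("42", "alice"))
m.name // "alice", via the implicit conversion
```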
Generalization

By the way, you can do this for any object that you want serialized, in a generic way...
To sum it all up
case class MongoRepr[T](_id: Option[BSONObjectID],
                        value: T,
                        created: Option[DateTime],
                        updated: Option[DateTime])

object MongoRepr {
  // Implement flattened (de)serialization; this depends on your
  // Mongo driver and what you're using for serialization (JSON/BSON).
  def handler[T](implicit handlerT: BSONHandler[T]): BSONHandler[MongoRepr[T]] = ???

  implicit def toValue[T](repr: MongoRepr[T]): T = repr.value
}
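To see the generic wrapper in action, here is a self-contained sketch: String stands in for BSONObjectID and DateTime so it runs without a Mongo driver, and the serialization handler is omitted since it depends on your stack.

```scala
import scala.language.implicitConversions

// Self-contained sketch of the generic wrapper; String stands in for
// BSONObjectID and DateTime so the example needs no Mongo driver.
case class MongoRepr[T](_id: Option[String],
                        value: T,
                        created: Option[String],
                        updated: Option[String])

object MongoRepr {
  // Transparent unwrapping: a MongoRepr[T] can be used as a T.
  implicit def toValue[T](repr: MongoRepr[T]): T = repr.value
}

case class User(name: String) // hypothetical payload type

val repr = MongoRepr(Some("abc"), User("alice"), None, None)
val u: User = repr // implicit unwrap
repr.name          // member selection forwarded to the wrapped User
```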
Source: stackoverflow.com