score:2
Accepted answer
This occurs because Play JSON's Writes and Reads macros share a common implementation. Within this implementation, the type A provided to Json.writes[A] is inspected for its apply method, which in turn is used to generate the Writes instance.

As an alternative to using the writes macro, you can roll your own Writes[CitySuggestion] like this:
import play.api.libs.json._
import play.api.libs.functional.syntax._

implicit val citySuggestionWrites: Writes[CitySuggestion] = (
  (JsPath \ "name").write[String] and
  (JsPath \ "locationId").write[String] and
  (JsPath \ "locationKind").write[String]
)(unlift(CitySuggestion.unapply))
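For reference, the Writes above assumes CitySuggestion is a case class with three String fields, roughly like the sketch below. The original question's definition isn't shown, so the exact shape is an assumption; the key point is that unlift(CitySuggestion.unapply) needs the companion's unapply to return an Option of the field tuple, which a plain case class provides automatically:

```scala
// Hypothetical shape of CitySuggestion, inferred from the fields
// written out by the manual Writes instance above.
case class CitySuggestion(name: String, locationId: String, locationKind: String)

// The synthesized unapply yields Option[(String, String, String)],
// which is what unlift(CitySuggestion.unapply) relies on.
val s = CitySuggestion("Berlin", "loc-123", "city")
val fields = CitySuggestion.unapply(s) // Some(("Berlin", "loc-123", "city"))
```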
Source: stackoverflow.com