Score: 4

A is in covariant position for both contains and filter.

The signature for contains is actually:

def contains[A1 >: A](elem: A1): Boolean

When a type variable is used on the right-hand side of a lower type bound, it is in covariant position.

With filter, the parameter is of type A => Boolean, i.e. Function1[A, Boolean]. Function1 is contravariant in its first type parameter, and the Function1 itself is in contravariant position as the parameter to filter. Two contravariances combine to make covariance.
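To see the double flip in a minimal, self-contained setting, here is a sketch with a hypothetical covariant container (Box is an illustrative name, not part of the standard library). It compiles precisely because the two contravariances cancel:

```scala
// Hypothetical covariant container, for illustration only.
class Box[+A](val value: A) {
  // A => Boolean is Function1[A, Boolean]: A sits in the contravariant
  // input slot of Function1, and the function itself sits in the
  // contravariant (parameter) position of filter, so A ends up back in
  // covariant position and the compiler accepts it under +A.
  def filter(p: A => Boolean): Option[A] =
    if (p(value)) Some(value) else None
}
```

Writing `def bad(elem: A): Boolean` in the same trait would be rejected, because there A would sit directly in contravariant position.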
One way to get your head around this is to imagine a List[X] where X <: Y. If this List[X] is cast to List[Y], are those methods still type-safe? contains now requires that A1 be a supertype of Y, which is actually a more stringent requirement. filter now requires a function Y => Boolean, which is also a more stringent requirement, because the passed-in function must be able to handle any Y, not just those that are instances of X.

On the other hand, casting a List[Y] to a List[X] would not be type-safe. If X also subtypes Z but Y does not, and Z is used for A1 in contains, then the type bound will be violated, because Z >: Y isn't true. For filter, the passed function would be X => Boolean, and it couldn't safely be passed the instances of Y that the list contains.

So we can conclude that A is indeed in covariant, not contravariant, position in these methods.
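The safe direction of that thought experiment can be sketched with concrete types (Animal and Dog are illustrative names, not from the original answer):

```scala
class Animal
class Dog extends Animal

val dogs: List[Dog] = List(new Dog)

// The upcast is fine: List is covariant in its element type.
val animals: List[Animal] = dogs

// contains[A1 >: Animal]: asking about any Animal still type-checks,
// even though the list only ever held Dogs.
val found: Boolean = animals.contains(new Animal)

// filter on a List[Dog] happily accepts the broader Animal => Boolean,
// because Function1 is contravariant in its input.
val pred: Animal => Boolean = a => a != null
val kept: List[Dog] = dogs.filter(pred)
```

The reverse cast, List[Animal] to List[Dog], is exactly what the compiler's variance checking rules out.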
Score: 1

This is allowed because, for covariant types, methods that take arguments can use a lower bound of T. Consider the following example:

trait MyList[+T] {
  def prepend(elem: T): MyList[T] = new Cons(elem, this)
}

This won't work, because you can't use a covariant type parameter as a method argument; you will get a compile-time error. On the other hand, you can restrict the method's type parameter with a lower bound of T, so the argument can be T or a supertype of T (instead of a subtype), like so:

trait MyList[+T] {
  def prepend[S >: T](elem: S): MyList[S] = new Cons(elem, this)
}
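A runnable sketch of the lower-bounded version, filling in the Cons the snippet assumes (Cons, MyNil, Fruit, Apple, and Orange are illustrative names):

```scala
sealed trait MyList[+T] {
  // S is inferred as the least upper bound of the prepended element's
  // type and T, so prepending a supertype widens the list's element type.
  def prepend[S >: T](elem: S): MyList[S] = Cons(elem, this)
}
// `this: MyList[T]` conforms to the required MyList[S] because T <: S
// and MyList is covariant.
case class Cons[+T](head: T, tail: MyList[T]) extends MyList[T]
case object MyNil extends MyList[Nothing]

class Fruit
class Apple extends Fruit
class Orange extends Fruit

val apples: MyList[Apple] = Cons(new Apple, MyNil)
// Prepending an Orange widens the result to MyList[Fruit].
val fruit: MyList[Fruit] = apples.prepend(new Orange)
```

This is the same trick List.:: and contains use in the standard library: the covariant element type never appears directly as a method argument, only through a lower-bounded type parameter.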
Source: stackoverflow.com