score:3
The Option trait (a.k.a. the Maybe monad) tends to be used with functions of type X => Option[Y]. The for (<-) yield syntactic sugar is replaced by the compiler with the corresponding flatMap calls, which are equivalent to the monadic bind operator. So you are expected to use functions like X => M[Y] in flatMap and in for-expressions with Lists, Options, Streams, Iterators, etc. Read more here.
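To make the desugaring concrete, here is a small sketch of my own (not from the answer); `lookup` is a hypothetical helper:

```scala
// Sketch: the compiler rewrites a for-expression into nested flatMap/map calls.
def lookup(s: String): Option[Int] =          // hypothetical helper
  if (s.nonEmpty) Some(s.length) else None

val sugared = for {
  a <- Option("hi")
  b <- lookup(a)
} yield b * 2

// Roughly what the compiler generates for the for-expression above:
val desugared = Option("hi").flatMap(a => lookup(a).map(b => b * 2))
```

Both values end up as Some(4), which is why for-comprehensions work with any type that provides flatMap and map.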
Also, your first expression has type List[Option[X]]. To extract the values properly you could use an intermediate value, as in Vladimir's answer, like this:
def func1(in: String): Option[String] = {
  Some(in)
}
def func2(in: String): Option[String] = {
  Some(in)
}
def func3(in: String): Option[String] = {
  Some(in)
}

val res = for {
  x <- List(Some("hello"), None, Some("world"))
  t <- x
  y <- func1(t)
  z <- func2(y)
  w <- func3(z)
} yield w

print(res)
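As an aside (my addition, not part of the original answer): the intermediate generator `t <- x` is what flattens the List[Option[String]] into plain Strings; when no further processing is needed, the same extraction can be done with flatten:

```scala
// Sketch: flatten drops the Nones and unwraps the Somes,
// turning a List[Option[String]] into a List[String].
val mixed: List[Option[String]] = List(Some("hello"), None, Some("world"))
val flat: List[String] = mixed.flatten
```

Here `flat` is List("hello", "world"); the None simply disappears.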
score:0
The error says exactly that: the funcX argument does not need to be Option[String]. Use just String, and extract the value with an intermediate generator:
def func1(in: String): Option[String] = {
  Some(in)
}
def func2(in: String): Option[String] = {
  Some(in)
}
def func3(in: String): Option[String] = {
  Some(in)
}

val res = for {
  x <- List(Some("hello"), None, Some("world"))
  t <- x
  y <- func1(t)
  z <- func2(y)
  w <- func3(z)
} yield w

print(res)
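If the underlying goal is avoiding explicit null checks (my addition, not from either answer), it is worth noting that Option.apply converts a possibly-null reference into an Option at the boundary, so the rest of the pipeline never mentions null:

```scala
// Option.apply wraps null as None and any other value as Some,
// so the null check happens once, where the value enters the code.
val absent: Option[String]  = Option(null: String)  // None
val present: Option[String] = Option("value")       // Some("value")
```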
Source: stackoverflow.com