Answer (score: 0)
Here is the code I tried, and it worked:
val upper = list.map(x => x.split(":")).map(x => x.map(x => x.split(";")))
which gives the output:
upper: List[Array[Array[String]]] = List(Array(Array(a, bc), Array(de, f)), Array(Array(uvw), Array(xy, z)), Array(Array(123), Array(456)))
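As a self-contained sketch of this approach (the input list is an assumption, reconstructed from the output shown above):

```scala
// Assumed input, inferred from the output above.
val list = List("a;bc:de;f", "uvw:xy;z", "123:456")

// First split each string on ":", then split each resulting piece on ";",
// producing a nested List[Array[Array[String]]].
val upper = list.map(x => x.split(":")).map(x => x.map(x => x.split(";")))
```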
Answer (score: 2)
Using list.map(x => x.split(":"))
will give you a List of Arrays:
upper: List[Array[String]] = List(Array(a;bc, de;f), Array(uvw, xy;z), Array(123, 456))
If you map over that result afterwards, each item is already an Array[String], not a String, so you can't call split on it directly.
You might use flatMap
instead, which will first give you List(a;bc, de;f, uvw, xy;z, 123, 456),
and then you can use map on those items, splitting on ;:
val upper = list.flatMap(_.split(":")).map(_.split(";"))
Output
upper: List[Array[String]] = List(Array(a, bc), Array(de, f), Array(uvw), Array(xy, z), Array(123), Array(456))
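A runnable sketch of the two steps (the input list is an assumption, inferred from the outputs shown in the answers):

```scala
// Assumed input, inferred from the outputs above.
val list = List("a;bc:de;f", "uvw:xy;z", "123:456")

// flatMap flattens the per-string Arrays into one flat List of tokens...
val tokens = list.flatMap(_.split(":"))
// ...and map then splits each token on ";".
val upper = tokens.map(_.split(";"))
```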
Answer (score: 2)
You can use split
with multiple delimiters in a single map iteration:
val upper = list.map(x => x.split("[:;]"))
//upper: List[Array[String]] = List(Array(a, bc, de, f), Array(uvw, xy, z), Array(123, 456))
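As a self-contained version (the input list is an assumption, inferred from the commented output above):

```scala
// Assumed input, inferred from the commented output above.
val list = List("a;bc:de;f", "uvw:xy;z", "123:456")

// "[:;]" is a regex character class matching either delimiter,
// so both splits happen in a single pass over each string.
val upper = list.map(_.split("[:;]"))
```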
Source: stackoverflow.com