The most concise way I can think of is to flip your mental model and put indices first:
indices map A
And I would potentially suggest using lift to return an Option:

indices map A.lift
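As a small sketch of why lift helps (my own example, not part of the original answer): A.lift treats the array as a total function Int => Option[T], so out-of-range indices yield None instead of throwing:

```scala
val A = Array("a", "b", "c")
val indices = Seq(0, 2, 5)

// Plain indexing throws ArrayIndexOutOfBoundsException for index 5:
// indices map A            // would crash here

// A.lift returns an Option, so a bad index becomes None:
val safe = indices map A.lift   // List(Some(a), Some(c), None)

// Keep only the elements that were actually in range:
val hits = safe.flatten         // List(a, c)
```

This works because the wrapped array is a PartialFunction[Int, T], which is where lift comes from.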
You can use map, which maps each element to a new element based on a mapping lambda. Note that on Array, you get an element at an index with the apply method:

indices.map(index => A.apply(index))
You can leave off apply:

indices.map(index => A(index))
You can also use the underscore syntax:

indices.map(A(_))
When you're in a situation like this, you can even leave off the underscore:

indices.map(A)
And you can use the alternate space syntax:
indices map A
You were trying to use foreach, which returns Unit and is only used for side effects. For example:

indices.foreach(index => println(A(index)))
indices.map(A).foreach(println)
indices map A foreach println
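Putting the variants together in one runnable sketch (my own example; the names A and indices are placeholders, assuming an Array[String] and a Seq[Int]):

```scala
object IndexSelectDemo extends App {
  val A = Array("zero", "one", "two", "three")
  val indices = Seq(3, 1)

  // All of these build the same sequence of selected elements:
  val v1 = indices.map(index => A.apply(index))
  val v2 = indices.map(index => A(index))
  val v3 = indices.map(A(_))
  val v4 = indices.map(A)
  val v5 = indices map A

  assert(Seq(v1, v2, v3, v4, v5).forall(_ == Seq("three", "one")))

  // foreach returns Unit, so it is for side effects only:
  indices map A foreach println   // prints "three" then "one"
}
```

Passing A directly to map works because the array is implicitly wrapped as a Seq, which is itself a function from index to element.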