score:5
Accepted answer
Well, I guess what you meant is: filter out the keys of the map that don't contain any value from the set.

m.filterKeys(key => s.exists(key.contains(_)))

This will do.
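A runnable sketch of that approach, with hypothetical sample data (keys assumed to follow a "product_<color>_<id>" pattern); `filter` is used here instead of `filterKeys` so it runs unchanged on Scala 2.13+, where `filterKeys` is deprecated:

```scala
// Hypothetical sample data: keys follow a "product_<color>_<id>" pattern
val s = Set("blue", "orange")
val m = Map(
  "product_orange_123" -> 1,
  "product_green_123"  -> 5,
  "product_blue_887"   -> 7
)

// Keep entries whose key contains at least one value from the set
val kept = m.filter { case (k, _) => s.exists(v => k.contains(v)) }
// "product_green_123" is dropped: it contains neither "blue" nor "orange"
```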
score:1
Set extends Function1.

collect in action:

m.collect { case (k, v) if s(k.split("_")(1)) => k -> v }

filterKeys in action:

m.filterKeys(key => s(key.split("_")(1)))

filter in action:

m.filter { case (k, _) => s(k.split("_")(1)) }
Explanation

Set extends Function1, so a Set instance can be applied directly to a key to check whether that key exists in the set.
scala> val s = Set("blue", "orange")
s: scala.collection.immutable.Set[String] = Set(blue, orange)

scala> s("blue")
res0: Boolean = true

scala> s("apple")
res1: Boolean = false

scala> val s = Set("blue", "orange")
s: scala.collection.immutable.Set[String] = Set(blue, orange)

scala> val m = Map("product_orange_123" -> 1, "prodoct_blue_123" -> 2, "product_green_123" -> 5, "product_blue_887" -> 7)
m: scala.collection.immutable.Map[String,Int] = Map(product_orange_123 -> 1, prodoct_blue_123 -> 2, product_green_123 -> 5, product_blue_887 -> 7)

scala> m.collect { case (k, v) if s(k.split("_")(1)) => k -> v }
res2: scala.collection.immutable.Map[String,Int] = Map(product_orange_123 -> 1, prodoct_blue_123 -> 2, product_blue_887 -> 7)

scala> m.filterKeys(key => s(key.split("_")(1)))
res3: scala.collection.immutable.Map[String,Int] = Map(product_orange_123 -> 1, prodoct_blue_123 -> 2, product_blue_887 -> 7)

scala> m.filter { case (k, _) => s(k.split("_")(1)) }
res4: scala.collection.immutable.Map[String,Int] = Map(product_orange_123 -> 1, prodoct_blue_123 -> 2, product_blue_887 -> 7)
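A note for newer Scala versions: on 2.13 and later, `Map.filterKeys` is deprecated in favour of the lazy `.view.filterKeys`, with `.toMap` to get a strict result back. The same filtering as the transcript above (same sample data) then looks like this:

```scala
// Same sample data as the transcript above
val s = Set("blue", "orange")
val m = Map("product_orange_123" -> 1, "prodoct_blue_123" -> 2,
            "product_green_123" -> 5, "product_blue_887" -> 7)

// .view.filterKeys is lazy; .toMap forces it back into a strict Map
val res = m.view.filterKeys(key => s(key.split("_")(1))).toMap
```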
Source: stackoverflow.com