score:1
Accepted answer
In the absence of any code, I'll provide a simple version, with the caveat that this is an inaccurate way of measuring and not great for things that take a small amount of time. It is only really useful if the operation you're measuring takes a decent amount of time, so that most of the time is spent in the Future and not in the test code.
import scala.concurrent._
import scala.concurrent.duration._
import ExecutionContext.Implicits.global

val myList = (1 to 10)
val start = System.currentTimeMillis
val seqFutures = for (elem <- myList) yield {
  Future {
    Thread.sleep(5000) // blocking operation
  }
}
val result = Future.sequence(seqFutures)
Await.result(result, 30.seconds)
val end = System.currentTimeMillis
println(s"${end - start} milliseconds")
If you are going to have a large number of blocking IO calls, it can be worth configuring a separate execution context with a larger thread pool just for the blocking IO, so the blocking calls don't consume all the threads in the default pool.
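One way to set that up, as a minimal sketch (the pool size of 20, the names, and the 100 ms sleep are all illustrative values, not recommendations):

import java.util.concurrent.Executors
import scala.concurrent.{ExecutionContext, Future}

// Dedicated fixed-size pool reserved for blocking IO, kept separate
// from the default global pool used for CPU-bound work.
val blockingIoEc: ExecutionContext =
  ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(20))

// Stand-in for a real blocking IO call (hypothetical example).
def blockingCall(): Int = {
  Thread.sleep(100)
  42
}

// Pass the dedicated context explicitly so this Future runs on the
// blocking-IO pool rather than the implicit global one.
val f = Future(blockingCall())(blockingIoEc)

Any Future that wraps blocking work can be handed this context the same way; the global context stays free for non-blocking computation. Remember to shut the executor down when you're done with it.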
Source: stackoverflow.com