This will come as little consolation for anyone who's stuck with the older iteratee API, but I recently verified that an equivalent test passes against the scalaz-stream API, a newer stream-processing library that is intended to replace iteratee.
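To run the test you'll need scalaz-stream on the classpath. Assuming an sbt build, a fragment like the one below pulls it in (the version number and the bintray resolver are assumptions; check the scalaz-stream README for the release matching your Scalaz version):

```scala
// build.sbt fragment -- version and resolver are illustrative assumptions,
// not taken from the answer above.
resolvers += Resolver.bintrayRepo("scalaz", "releases")

libraryDependencies += "org.scalaz.stream" %% "scalaz-stream" % "0.8.6a"
```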
For completeness, here's the test code:

import scalaz.concurrent.Task
import scalaz.stream.{ Process, process1 }

// create a stream containing `n` arrays with `sz` Ints in each one
def streamArrs(sz: Int, n: Int): Process[Task, Array[Int]] =
  (Process emit Array.fill(sz)(0)).repeat take n

// pair each array with its index, group into chunks of four,
// and fold a running total of the array lengths seen
(streamArrs(1 << 25, 1 << 14).zipWithIndex
  pipe process1.chunk(4)
  pipe process1.fold(0L) {
    (c, vs) => c + vs.map(_._1.length.toLong).sum
  }).runLast.run
This should work with any value for the n parameter (provided you're willing to wait long enough). I tested with 2^14 arrays of 2^25 Ints each, i.e. 128 MiB per array and a total of 2 TiB of memory allocated over time.
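The cost of a run follows directly from the two parameters. A minimal sketch of the arithmetic, assuming 4-byte JVM Ints (the object name is made up for illustration):

```scala
// Arithmetic behind the memory figures: each array holds `sz` 4-byte Ints,
// and `n` such arrays are allocated over the stream's lifetime.
object AllocationMath extends App {
  val sz = 1L << 25           // Ints per array (matches the test above)
  val n  = 1L << 14           // number of arrays
  val bytesPerArray = sz * 4L // a JVM Int is 4 bytes
  val totalBytes    = bytesPerArray * n

  println(s"per array: ${bytesPerArray >> 20} MiB") // 128 MiB
  println(s"total:     ${totalBytes >> 40} TiB")    // 2 TiB
}
```

The point of the test is that only a constant number of these arrays are live at once, so the heap stays small even though terabytes pass through the stream.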
Source: stackoverflow.com