Score: 3
If I understand the question correctly, then I can offer up a couple of ways you can accomplish this (though there are certainly others).
Option 1
In this approach, there will be an actor that is responsible for waking up periodically and sending a request to all session actors to get their current stats. That actor will use ActorSelection with a wildcard to accomplish that goal. A rough outline of the code for this approach is as follows:
import akka.actor._
import scala.concurrent.duration._

case class SessionStats(foo: Int, bar: Int)
case object GetSessionStats

class SessionActor extends Actor {
  def receive = {
    case GetSessionStats =>
      println(s"${self.path} received a request to get stats")
      sender() ! SessionStats(1, 2)
  }
}

case object GatherStats

class SessionStatsGatherer extends Actor {
  // wake up every 5 seconds and remind ourselves to gather stats
  context.system.scheduler.schedule(5.seconds, 5.seconds, self, GatherStats)(context.dispatcher)

  def receive = {
    case GatherStats =>
      println("waking up to gather stats")
      // wildcard selection: matches every top-level actor named "session..."
      val sel = context.system.actorSelection("/user/session*")
      sel ! GetSessionStats

    case SessionStats(f, b) =>
      println(s"got session stats from ${sender().path}, values are $f and $b")
  }
}
Then you could test this code with the following:
val system = ActorSystem("test")
system.actorOf(Props[SessionActor], "session-1")
system.actorOf(Props[SessionActor], "session-2")
system.actorOf(Props[SessionStatsGatherer])
Thread.sleep(10000)
system.actorOf(Props[SessionActor], "session-3")
So with this approach, as long as we use a naming convention, we can use an ActorSelection with a wildcard to always find all of the session actors, even though they are constantly coming (starting) and going (stopping).
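Note that an ActorSelection is not an ActorRef; if you ever need concrete refs for the matched sessions (for example, to watch or count them), the selection can be resolved with Akka's built-in Identify/ActorIdentity handshake. A minimal sketch, building on the actors above (the SessionLocator name and the "sessions" correlation id are purely illustrative):

import akka.actor._

class SessionLocator extends Actor {
  // ask every actor matching the wildcard to identify itself
  override def preStart(): Unit =
    context.system.actorSelection("/user/session*") ! Identify("sessions")

  def receive = {
    // each matching session actor replies with its concrete ActorRef
    case ActorIdentity("sessions", Some(ref)) =>
      println(s"found session actor: ${ref.path}")
  }
}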
Option 2
A somewhat similar approach, but in this one we use a centralized actor to spawn the session actors and act as a supervisor to them. This central actor also contains the logic to periodically poll for stats, but since it is the parent, it does not need an ActorSelection and can instead just use its children list. That would look like this:
case object SpawnSession

class SessionsManager extends Actor {
  // poll the children for stats every 5 seconds
  context.system.scheduler.schedule(5.seconds, 5.seconds, self, GatherStats)(context.dispatcher)
  var sessionCount = 1

  def receive = {
    case SpawnSession =>
      val session = context.actorOf(Props[SessionActor], s"session-$sessionCount")
      println(s"spawned session: ${session.path}")
      sessionCount += 1
      sender() ! session

    case GatherStats =>
      println("waking up to get session stats")
      context.children foreach (_ ! GetSessionStats)

    case SessionStats(f, b) =>
      println(s"got session stats from ${sender().path}, values are $f and $b")
  }
}
And it could be tested as follows:
val system = ActorSystem("test")
val manager = system.actorOf(Props[SessionsManager], "manager")
manager ! SpawnSession
manager ! SpawnSession
Thread.sleep(10000)
manager ! SpawnSession
Now, these examples are extremely trivialized, but hopefully they paint a picture of how you could go about solving this issue with either ActorSelection or a management/supervision dynamic. And as a bonus, ask is not needed in either, and there is no blocking.
Score: 1
There have been many additional changes in this project, so my answer/comments have been delayed quite a bit :-/
First, the session stats gathering should not be periodic, but on request. My original idea was to "mis-use" the actor system as my map of all existing session actors, so that I would not need a supervisor actor that knows all sessions.
This goal has proven elusive: session actors depend on shared state, so the session creator must watch the sessions anyway.
This makes option 2 the obvious answer here; the session creator has to watch all children anyway.
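For completeness, here is roughly what that watching could look like when added to option 2's manager; a sketch building on the classes above (WatchingSessionsManager is an illustrative name, not code from the answer):

import akka.actor._

class WatchingSessionsManager extends Actor {
  var sessionCount = 1

  def receive = {
    case SpawnSession =>
      val session = context.actorOf(Props[SessionActor], s"session-$sessionCount")
      context.watch(session) // we now receive Terminated(session) when it stops
      sessionCount += 1
      sender() ! session

    // fired for any watched child that stops, crashes included
    case Terminated(session) =>
      println(s"session ended: ${session.path}")
  }
}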
The most vexing hurdle with option 1 was how to determine when all (current) answers have arrived. I wanted the statistics request to take a snapshot of all currently existing actor names, query them, and ignore failures (if a session dies before it can be queried, it can be ignored here); the statistics request is only a debugging tool, i.e. something like a "best effort". The actor selection API tangled me up in a thicket of futures (I am a Scala/Akka newbie), so I gave up on this route.
Option 2 is therefore better suited to my needs.
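For reference, a hedged sketch of how that best-effort "snapshot, query, ignore failures" idea could be expressed on top of option 2's manager: ask each currently existing child with a timeout and drop any answer that does not arrive. The GetAllStats message and the StatsOnRequestManager name are illustrative assumptions, building on the classes defined earlier:

import akka.actor._
import akka.pattern.{ask, pipe}
import akka.util.Timeout
import scala.concurrent.Future
import scala.concurrent.duration._

case object GetAllStats

class StatsOnRequestManager extends Actor {
  import context.dispatcher
  implicit val timeout: Timeout = Timeout(2.seconds)
  var sessionCount = 1

  def receive = {
    case SpawnSession => // spawn children exactly as in option 2
      context.actorOf(Props[SessionActor], s"session-$sessionCount")
      sessionCount += 1

    case GetAllStats =>
      // snapshot of the children that exist at this instant
      val snapshot = context.children.toList
      val stats = snapshot.map { child =>
        (child ? GetSessionStats)
          .mapTo[SessionStats]
          .map(Some(_))
          .recover { case _ => None } // a session that died or timed out is ignored
      }
      // reply to the requester with whatever answers arrived in time
      Future.sequence(stats).map(_.flatten) pipeTo sender()
  }
}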