Score: 12
Accepted answer
This exception is usually a consequence of actorSystem.shutdown:
scala> val actorSystem: ActorSystem = ActorSystem("My-Actor-System")
actorSystem: akka.actor.ActorSystem = akka://My-Actor-System
scala> class Aaa extends Actor{ def receive = { case _ => }}
defined class Aaa
scala> actorSystem.actorOf(Props(new Aaa))
res178: akka.actor.ActorRef = Actor[akka://My-Actor-System/user/$a#-842131493]
scala> actorSystem.shutdown
scala> actorSystem.actorOf(Props(new Aaa))
java.lang.IllegalStateException: cannot create children while terminating or terminated
at akka.actor.dungeon.Children$class.makeChild(Children.scala:200)
at akka.actor.dungeon.Children$class.attachChild(Children.scala:40)
at akka.actor.ActorCell.attachChild(ActorCell.scala:369)
at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:554)
... 43 elided
Since one instance of ActorSystem is shared between many tests by default, it's possible that one of them shut it down explicitly or implicitly (for example inside after or even afterAll). Also, you have created two actor systems with the same name here: one from (actorSystemConfig: ActorSystemConfig) and a second from the with ActorSystemConfig mixin (see PolicyDAOSystemTest). The first one is shared between all tests in the application by default.
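For illustration, here is a minimal sketch of how this typically plays out across test suites. It assumes a ScalaTest 2.x-style FlatSpec with BeforeAndAfterAll and the pre-2.4 Akka API seen in the stack trace above; SharedSystem, EchoActor, FirstSpec and SecondSpec are hypothetical names, not taken from the question:

import akka.actor.{Actor, ActorSystem, Props}
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

// Hypothetical shared system, standing in for the one built from (actorSystemConfig: ActorSystemConfig)
object SharedSystem {
  val system: ActorSystem = ActorSystem("My-Actor-System")
}

// Hypothetical actor, same shape as Aaa above
class EchoActor extends Actor {
  def receive = { case msg => sender() ! msg }
}

class FirstSpec extends FlatSpec with BeforeAndAfterAll {
  "the shared system" should "create actors while it is running" in {
    SharedSystem.system.actorOf(Props(new EchoActor)) // fine: the system is still alive
  }

  // Shutting down the *shared* system here breaks every suite that runs afterwards
  override def afterAll(): Unit = SharedSystem.system.shutdown()
}

class SecondSpec extends FlatSpec {
  "a later suite" should "fail when it asks the dead system for an actor" in {
    // throws IllegalStateException: cannot create children while terminating or terminated
    SharedSystem.system.actorOf(Props(new EchoActor))
  }
}

The fix is either to stop shutting down the shared system from individual suites, or to give each suite its own ActorSystem (with its own name) and shut that one down in afterAll.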
Source: stackoverflow.com