I don't think standalone mode is the problem. You described creating only one pool, so I think the issue is that you need at least one more pool and then assign each job to a different pool.
FAIR scheduling happens across pools; jobs submitted to the same pool run in FIFO order by default anyway.
This is based on the documentation here: https://spark.apache.org/docs/latest/job-scheduling.html#default-behavior-of-pools
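A minimal sketch of what that could look like (the pool names `pool1`/`pool2`, the app name, and the allocation-file path are placeholders of mine, not from your setup; pools referenced via `spark.scheduler.pool` that are missing from the file are created with default settings):

```scala
import org.apache.spark.sql.SparkSession

object TwoParallelJobs {
  def main(args: Array[String]): Unit = {
    // Enable the FAIR scheduler; the allocation file is optional but
    // lets you tune weight/minShare per pool (path is illustrative).
    val spark = SparkSession.builder()
      .appName("two-parallel-jobs")
      .config("spark.scheduler.mode", "FAIR")
      .config("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml")
      .getOrCreate()
    val sc = spark.sparkContext

    // Each job is submitted from its own thread and tagged with its own
    // pool, so the FAIR scheduler can interleave the two jobs' stages.
    val t1 = new Thread(() => {
      sc.setLocalProperty("spark.scheduler.pool", "pool1")
      sc.parallelize(1 to 1000000).count()
    })
    val t2 = new Thread(() => {
      sc.setLocalProperty("spark.scheduler.pool", "pool2")
      sc.parallelize(1 to 1000000).count()
    })
    t1.start(); t2.start()
    t1.join(); t2.join()

    spark.stop()
  }
}
```

Note that `setLocalProperty` is per-thread, which is why each job gets its own thread here: jobs submitted from the same thread would land in the same pool and fall back to FIFO.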