You need to use a multi-project SBT build. That is, have one build.sbt file for the entire project; IntelliJ should honor that without a problem. Your build.sbt file, which should go in your project's root directory, would need to look like this:
lazy val commonSettings = Seq(
  scalaVersion := "2.12.1",
  version := "0.1"
)

lazy val common = project.in(file("sbt_testing/common")).
  settings(commonSettings: _*).
  settings(
    name := "common"
  )

lazy val hello = project.in(file("sbt_testing/hello")).
  dependsOn(common).
  settings(commonSettings: _*).
  settings(
    name := "Hello"
  )
As you can see, grouping the settings shared by both projects keeps them consistent with each other.
You will need to remove the build.sbt files in sbt_testing/common and sbt_testing/hello.
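If you also want a plain "sbt run" (or "sbt compile") at the root to cover both subprojects, you can add a root project that aggregates them. This is a sketch, not part of the original answer; the project name "root" is an assumption:

```scala
// Optional: a root project aggregating both subprojects, so commands run
// at the root are broadcast to common and hello (a sketch).
lazy val root = project.in(file(".")).
  aggregate(common, hello).
  settings(commonSettings: _*)
```

With an aggregate in place, "sbt compile" at the root compiles both subprojects in one step.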
UPDATE: Corrected the use of commonSettings in the build.sbt file above. Apologies for the confusion!
UPDATE 2: To run the code in the HelloWorld class, do the following (I also renamed the "root" project to "hello"):
$ sbt
[info] Loading global plugins from /home/user/.sbt/0.13/plugins
[info] Loading project definition from /home/user/src/multiSbt/project
[info] Set current project to multisbt (in build file:/home/user/src/multiSbt/)
> project hello
[info] Set current project to Hello (in build file:/home/user/src/multiSbt/)
> run
[info] Compiling 1 Scala source to
/home/user/src/multiSbt/sbt_testing/hello/target/scala-2.12/classes...
[info] Running HelloWorld
Hello, world!
This is from common: testing 123
[success] Total time: 3 s, completed May 1, 2017 3:27:16 PM
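For reference, the transcript above assumes source files roughly like the following. This is a hypothetical sketch; the object names, the Common.message helper, and the file paths in the comments are assumptions, since the question's actual sources are not shown:

```scala
// sbt_testing/common/src/main/scala/Common.scala (hypothetical path)
// A value defined in the common subproject.
object Common {
  def message: String = "testing 123"
}

// sbt_testing/hello/src/main/scala/HelloWorld.scala (hypothetical path)
// The hello subproject can use Common because of dependsOn(common).
object HelloWorld extends App {
  println("Hello, world!")
  println(s"This is from common: ${Common.message}")
}
```

Because hello declares dependsOn(common), anything on common's classpath (like Common here) is visible to HelloWorld at compile time and at run time.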
For more information on SBT multi-project builds, refer to the official sbt documentation.
Source: stackoverflow.com