score:0
I have figured it out. We can define variables in build.sbt and use them to construct the report directory name:
import java.text.SimpleDateFormat
import java.util.Calendar

val format = new SimpleDateFormat("dd-MM-yy-hhmmss")
val timeStamp: String = format.format(Calendar.getInstance().getTime)
val resultDirectory: String = "target/test-reports/" + timeStamp

testOptions in Test ++= Seq(
  Tests.Argument(TestFrameworks.ScalaTest, "-o"),
  Tests.Argument(TestFrameworks.ScalaTest, "-h", resultDirectory)
)

libraryDependencies += "org.pegdown" % "pegdown" % "1.6.0" % "test"
I tried this earlier but it did not work. The reason is that every time you change build.sbt you need to reload the sbt shell, which I had not done before.
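The timestamp logic above can be checked outside of sbt. This is a minimal standalone sketch of the same formatting code (the object name `LegacyTimestampCheck` is just for illustration); note that `hh` in the pattern is the 12-hour clock, so use `HH` if you want 24-hour timestamps:

```scala
import java.text.SimpleDateFormat
import java.util.Calendar

object LegacyTimestampCheck {
  // Same pattern as in build.sbt: day-month-year, then 12-hour time with no separators
  def timeStamp: String =
    new SimpleDateFormat("dd-MM-yy-hhmmss").format(Calendar.getInstance().getTime)

  def main(args: Array[String]): Unit =
    // e.g. target/test-reports/02-07-19-074159
    println("target/test-reports/" + timeStamp)
}
```

Because the value is computed once when build.sbt is loaded, all test runs in the same sbt session share one directory; a fresh directory appears only after `reload`.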
score:1
Try
libraryDependencies += "org.pegdown" % "pegdown" % "1.6.0"

testOptions in Test ++= Seq(
  Tests.Argument(TestFrameworks.ScalaTest, "-o"),
  Tests.Argument(TestFrameworks.ScalaTest, "-h", s"target/test-reports-$testDirTimestamp")
)

def testDirTimestamp: String = {
  import java.time.LocalDateTime
  import java.time.format.DateTimeFormatter
  LocalDateTime.now.format(DateTimeFormatter.ofPattern("yyyy-MM-ddHHmmss"))
}
Running sbt test should then create reports under a directory such as
target/test-reports-2019-07-02074159
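The java.time helper from this answer can also be verified on its own. This is a hedged standalone sketch (the object name `TimestampDemo` is invented for the example), reproducing the same `DateTimeFormatter` pattern used in the build definition:

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

object TimestampDemo {
  // Same pattern as in build.sbt: ISO-style date followed by HHmmss with no separator
  def testDirTimestamp: String =
    LocalDateTime.now.format(DateTimeFormatter.ofPattern("yyyy-MM-ddHHmmss"))

  def main(args: Array[String]): Unit =
    // e.g. target/test-reports-2019-07-02074159
    println(s"target/test-reports-$testDirTimestamp")
}
```

Using `java.time` avoids the thread-safety problems of the legacy `SimpleDateFormat`, and `HH` gives 24-hour timestamps so report directories sort chronologically within a day.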
Source: stackoverflow.com