Score: 4
First, defining `sampleKeyA in Compile` is of course valid, since it scopes the setting to the Compile configuration.
Second, you get "value 1" because you are using `sampleKeyA` without the above scope. Change it to `sampleKeyA in Compile` and you will get "value 2".
To see this, just start an "empty" sbt session and execute the following:
> set SettingKey[String]("sampleKeyA") := "value 1"
[info] Reapplying settings...
[info] Set current project to default-a57b70 (in build file:/Users/heiko/tmp/sbt/)
> set SettingKey[String]("sampleKeyA") in Compile := "value 2"
[info] Reapplying settings...
[info] Set current project to default-a57b70 (in build file:/Users/heiko/tmp/sbt/)
> sampleKeyA
[info] value 1
> compile:sampleKeyA
[info] value 2
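The same two-scope setup can also be written directly in a build definition. This is only a sketch of the situation the answer describes, assuming sbt 0.13-style syntax; `sampleKeyA` is the key from the question:

```scala
// build.sbt (sketch)
val sampleKeyA = settingKey[String]("a sample key, scoped two ways")

sampleKeyA := "value 1"             // unscoped: the project-level default
sampleKeyA in Compile := "value 2"  // scoped to the Compile configuration

// In the sbt shell, an unscoped lookup (> sampleKeyA) resolves to the
// project-level value, while a Compile-scoped lookup resolves to "value 2".
```

Because sbt resolves keys by delegation, the unscoped lookup never "falls forward" into the Compile-scoped value; you have to ask for the scope explicitly.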
Score: 2
It picked "value 1" because "value 2" is scoped in Compile, but you asked for the general (unscoped) version. If you wrote `sampleKeyA in Compile`, it would then work. Or, perhaps, `in compile` -- though I think that declaration is incorrect, as the scope `compile` doesn't really exist.
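The distinction this answer hints at is that `Compile` (capitalized) is a configuration, while lowercase `compile` is a task key, so scoping "to compile" only makes sense on the task axis. A hedged sketch of the two forms, assuming the same `sampleKeyA` key as above:

```scala
// Sketch of the two scoping axes involved (sbt 0.13-style syntax)
sampleKeyA in Compile := "config-scoped"           // Compile is a Configuration
sampleKeyA in (Compile, compile) := "task-scoped"  // compile is a TaskKey; this
                                                   // scopes within the compile
                                                   // task of the Compile config
```

So a bare `in compile` is not a configuration scope at all, which matches the answer's suspicion that "the scope compile doesn't really exist" as written.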
Source: stackoverflow.com