A rather stupid solution, so please do add better answers:
$ mv ~/.sbt/plugins/build.sbt ~/Desktop
$ sbt publish-local
This will try to reload the plugins, but it mistakenly asks for them in the Scala 2.9.2 flavour instead of Scala 2.9.1, so it ends up with problems finding the plugins. Then move the file back:
$ mv ~/Desktop/build.sbt ~/.sbt/plugins/build.sbt
$ sbt publish-local
This re-downloads the plugins for Scala 2.9.1, and the GPG plugin is enabled again. Voilà.
Edit: Never try to move the file into the base directory ~/.sbt/ itself – you will break a lot.
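For context, the file being shuffled around is the global plugins definition. A minimal sketch of what it might contain (the organization and version string below are illustrative – check the sbt-pgp release notes for the coordinates matching your sbt version):

```scala
// ~/.sbt/plugins/build.sbt — global sbt plugin definitions
// Version "0.8" is an assumption for the sbt 0.12.x era; verify before use.
addSbtPlugin("com.typesafe.sbt" % "sbt-pgp" % "0.8")
```

Moving this file away simply means sbt loads no global plugins for that run, which is why the workaround above avoids the Scala-version mismatch during `publish-local`.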