You are mixing things up.
So then I looked at this question which says...
- You'll need to load the driver somewhere. Class.forName("org.postgresql.Driver");
- You'll need the postgresql driver .jar file in the classpath of your program.
This is not applicable in this case: you have a framework that takes care of these things for you. The question you refer to describes how to access a database with "raw" JDBC.
Here's how you should do it.
First of all, you can simplify the configuration part: 5432 is the default port for PostgreSQL and localhost is the default host, so both can be omitted from the URL. Username and password should be placed outside of the URL.
# Database configuration
db.default.driver=org.postgresql.Driver
db.default.url=jdbc:postgresql:postgres
db.default.user=user
db.default.password=password
Now define proper sbt dependencies. Remove the jar files from the /lib folder and update the appDependencies in your Build.scala to pull in the current PostgreSQL driver (9.3). Be aware that the groupId has changed from postgresql to org.postgresql.
val appDependencies = Seq(
  // Add your project dependencies here,
  jdbc,
  "com.typesafe.slick" %% "slick" % "1.0.0",
  "org.postgresql" % "postgresql" % "9.3-1102-jdbc41"
)
And finally you should change your controller to resolve the datasource from configuration:
// Imports needed for this to compile (Play 2.1 / Slick 1.0):
import play.api.Play.current
import play.api.db.DB
import scala.slick.session.Database
import Database.threadLocalSession

def instance = Action {
  Database.forDataSource(DB.getDataSource()) withSession {
    val q = Retailer.map(_.name)
    Ok(views.html.instance(q.list, newRForm))
  }
}
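For reference, the Retailer object used in the query above is assumed to be a Slick 1.0 lifted-embedding table definition along these lines; the table and column names here are illustrative, so adjust them to your actual schema:

```scala
import scala.slick.driver.PostgresDriver.simple._

// Hypothetical table definition backing `Retailer.map(_.name)`;
// the tuple type and column names are assumptions, not from the original post.
object Retailer extends Table[(Long, String)]("retailer") {
  def id   = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def *    = id ~ name
}
```

With a definition like this in scope, `Retailer.map(_.name)` produces a query over the name column, and `q.list` materializes it inside the session opened by `withSession`.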
Source: stackoverflow.com