score:0

Use SBT and try the Typesafe Config library.

Here is a simple but complete example that reads settings from a config file placed in the resources folder.
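A minimal sketch of that idea, assuming an application.conf under src/main/resources; the google.* key names and the 1.4.2 version are illustrative, not anything mandated by the library:

import com.typesafe.config.{Config, ConfigFactory}

// build.sbt: libraryDependencies += "com.typesafe" % "config" % "1.4.2"
object AppConfig {
  // ConfigFactory.load() reads application.conf from the classpath
  // (src/main/resources), so the file travels inside the assembled jar
  private val config: Config = ConfigFactory.load()

  val serviceAccountId: String = config.getString("google.serviceAccountId")
  val spreadsheetId: String = config.getString("google.spreadsheetId")
}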

Then you can assemble a fat jar using the sbt-assembly plugin.
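Wiring the plugin in is one line in project/plugins.sbt; the version below is an assumption, so check the sbt-assembly releases for the current one:

// project/plugins.sbt (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

Running sbt assembly then produces a single jar containing your classes, application.conf, and all dependencies.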

score:0

If you're working in the Databricks environment, you can upload the credentials file.

Setting the GOOGLE_APPLICATION_CREDENTIALS environment variable, as described here, does not get you around this requirement, because the variable holds a path to the key file rather than the credentials themselves. See here for more details about getting the right credentials and using the library.
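A minimal sketch of the distinction; it only inspects what the variable points at and assumes nothing about your cluster layout:

import java.nio.file.{Files, Paths}

// GOOGLE_APPLICATION_CREDENTIALS holds a path to the key file,
// so the file itself must exist at that path on every node that needs it
sys.env.get("GOOGLE_APPLICATION_CREDENTIALS") match {
  case Some(path) if Files.exists(Paths.get(path)) =>
    println(s"Key file found at: $path")
  case Some(path) =>
    println(s"Variable is set, but no file exists at: $path")
  case None =>
    println("GOOGLE_APPLICATION_CREDENTIALS is not set")
}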

score:1

You can use the --files argument of spark-submit or SparkContext.addFile() to distribute a credential file; either way the file is shipped to every executor. To get the local path of the credential file on a worker node, call SparkFiles.get("credential filename").

import org.apache.spark.SparkFiles

// you can also ship the file with `spark-submit --files=credential.p12`
sqlContext.sparkContext.addFile("credential.p12")
// resolve the local path of the distributed copy on this node
val credentialPath = SparkFiles.get("credential.p12")

val df = sqlContext.read.
    format("com.github.potix2.spark.google.spreadsheets").
    option("serviceAccountId", "xxxxxx@developer.gserviceaccount.com").
    option("credentialPath", credentialPath).
    load("<spreadsheetId>/worksheet1")
