score:0
Use SBT and try the Typesafe Config library. The idea is to read your settings from a config file placed in the resources folder, and then assemble a runnable jar with the sbt-assembly plugin.
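A minimal sketch of that approach, assuming an application.conf in src/main/resources; the key names and values here are illustrative assumptions, not from the original answer:

// src/main/resources/application.conf (hypothetical keys):
// google {
//   serviceAccountId = "xxxxxx@developer.gserviceaccount.com"
//   credentialPath = "/path/to/credential.p12"
// }

import com.typesafe.config.{Config, ConfigFactory}

object AppConfig {
  // ConfigFactory.load() picks up application.conf from the classpath,
  // which includes src/main/resources once the jar is assembled
  private val config: Config = ConfigFactory.load()

  val serviceAccountId: String = config.getString("google.serviceAccountId")
  val credentialPath: String = config.getString("google.credentialPath")
}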
score:0
If you're working in the Databricks environment, you can upload the credentials file. Setting the GOOGLE_APPLICATION_CREDENTIALS environment variable, as described here, does not get you around this requirement, because the variable only points to the file path; it does not contain the actual credentials. See here for more details about getting the right credentials and using the library.
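As a small illustration of that point, a sketch assuming the variable is set (the fallback path is hypothetical): the environment variable yields only a path string, and the credential material still has to be read from the file itself.

import java.nio.file.{Files, Paths}

// GOOGLE_APPLICATION_CREDENTIALS holds only a path, not the key material
val credPath: String =
  sys.env.getOrElse("GOOGLE_APPLICATION_CREDENTIALS", "/path/to/key.json")

// The actual credentials are the contents of the file at that path
val credJson: String = new String(Files.readAllBytes(Paths.get(credPath)))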
score:1
You can use the --files argument of spark-submit or SparkContext.addFile() to distribute a credential file. If you want the local path of the credential file on a worker node, call SparkFiles.get() with the credential file name.
import org.apache.spark.SparkFiles

// you can also use `spark-submit --files=credential.p12`
sqlContext.sparkContext.addFile("credential.p12")
val credentialPath = SparkFiles.get("credential.p12")

val df = sqlContext.read.
  format("com.github.potix2.spark.google.spreadsheets").
  option("serviceAccountId", "xxxxxx@developer.gserviceaccount.com").
  option("credentialPath", credentialPath).
  load("<spreadsheetId>/worksheet1")
Source: stackoverflow.com