In Spark, building your application against a library does not mean that library will be on the runtime classpath. If you pass the connector to your spark-submit invocation (for example via the `--packages` option), all of the necessary libraries are included at runtime, both on the driver and on all of the remote executor JVMs.

So what you are seeing is that in the first example none of the connector libraries are on the runtime classpath, while in the spark-shell example they are.
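A minimal sketch of such an invocation (the connector version, application class, and jar name here are illustrative; substitute the artifact that matches your Spark and Scala versions):

```shell
# Pull the Spark Cassandra Connector (and its transitive dependencies)
# onto the runtime classpath of the driver and of every executor JVM.
# The connector version below is an example; pick the one that matches
# your Spark/Scala build. com.example.MyApp and my-app.jar are placeholders.
spark-submit \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.2 \
  --conf spark.cassandra.connection.host=127.0.0.1 \
  --class com.example.MyApp \
  my-app.jar
```

Alternatively, bundle the connector into a fat jar (e.g. with sbt-assembly) so the classes ship inside your application jar and no `--packages` flag is needed.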
- Spark running on Cassandra fails due to ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition (details inside)