Accepted answer

In Spark, building against a library does not mean it will be included on the runtime classpath. If you add

--jars  /home/Applications/spark-2.0.0-bin-hadoop2.7/spark-cassandra-connector-assembly-2.0.0-M1-22-gab4eda2.jar

to your spark-submit invocation for your application, the connector assembly and all of its bundled dependencies will be available at runtime, both on the driver and on all of the remote JVMs running your executors.
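As a sketch, a complete spark-submit command with the flag in place might look like the following. The master URL, main class, and application jar name are placeholders for illustration; only the --jars argument comes from the answer above:

```shell
# Hypothetical invocation: substitute your own master, class, and app jar.
spark-submit \
  --master spark://master-host:7077 \
  --class com.example.MyApp \
  --jars /home/Applications/spark-2.0.0-bin-hadoop2.7/spark-cassandra-connector-assembly-2.0.0-M1-22-gab4eda2.jar \
  my-application.jar
```

spark-submit ships every jar listed in --jars to the executors and adds them to both the driver and executor classpaths, which is why the connector classes become visible at runtime.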

So what you are seeing is that in the first example none of the connector libraries are on the runtime classpath, while in the spark-shell example they are.
