score:0
This error means that you don't have the following file in your resources:
META-INF/services/org.apache.spark.sql.sources.DataSourceRegister
It should be a part of the ignite-spark dependency.
So here is what you should check:
1) That ignite-spark-2.7.0.jar exists on the classpath of every node where you run Spark.
2) In case you use spark.driver.extraClassPath, check that:
a. You run in client mode (--deploy-mode client), because Spark fires up a Netty HTTP server which distributes the files to each of the worker nodes on startup. In cluster mode Spark selects a leader worker node to execute the driver process on, which means the job isn't running directly from the master node.
b. I am not sure, but it looks like extraClassPath requires a list of jar files instead of /path/to/lib/*. You can try the following:
EXECUTOR_PATH=""
for eachjarinlib in $JARS ; do
  if [ "$eachjarinlib" != "applicationjartobeaddedseperately.jar" ]; then
    EXECUTOR_PATH="file:$eachjarinlib:$EXECUTOR_PATH"
  fi
done
spark-submit --deploy-mode client --master yarn --conf "spark.driver.extraClassPath=$EXECUTOR_PATH" --class $EXAMPLE_CLASS $PATH_TO_JAR
where $JARS is the path to your libs.
score:1
spark.executor.extraClassPath /opt/ignite/libs/:/opt/ignite/libs/optional/ignite-spark/:/opt/ignite/libs/optional/ignite-log4j/:/opt/ignite/libs/optional/ignite-yarn/:/opt/ignite/libs/ignite-spring/*
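Note that with JVM classpath semantics a bare directory entry only picks up .class files and resources directly under it, not the jars inside it; each directory needs a /* suffix for its jars to be loaded. A corrected version might look like this (a sketch, assuming the standard layout from the paths above):

```
spark.executor.extraClassPath /opt/ignite/libs/*:/opt/ignite/libs/optional/ignite-spark/*:/opt/ignite/libs/optional/ignite-log4j/*:/opt/ignite/libs/optional/ignite-yarn/*:/opt/ignite/libs/ignite-spring/*
```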
I think this is the real issue: it seems you have to lower the version of Ignite.
<dependency>
    <groupId>org.apache.ignite</groupId>
    <artifactId>ignite-spark</artifactId>
    <version>2.6.0</version>
</dependency>
You can see (source):
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
    <scope>compile</scope>
</dependency>
Also see:
1) IGNITE-8534, which they fixed in version 2.6 of Ignite
2) Discussion: upgrade ignite-spark module's Spark version to 2.3.0
Call the function below in your driver; it will print all the classpath entries so you can debug which jars are on your classpath. The ignite-spark jar should be present at runtime.
The caller would be...
urlsInClasspath(getClass.getClassLoader).foreach(println)

def urlsInClasspath(cl: ClassLoader): Array[java.net.URL] = cl match {
  case null => Array()
  case u: java.net.URLClassLoader => u.getURLs() ++ urlsInClasspath(cl.getParent)
  case _ => urlsInClasspath(cl.getParent)
}
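One caveat, not from the original answer: on Java 9 and later the application class loader is no longer a URLClassLoader, so the match above falls through and prints nothing useful. Reading the java.class.path system property is a portable fallback (a sketch under that assumption):

```scala
// Portable fallback: the JVM exposes the launch classpath as a system
// property, split on the platform-specific path separator (':' or ';').
def classpathEntries: Array[String] =
  sys.props("java.class.path").split(java.io.File.pathSeparator)

classpathEntries.foreach(println)
```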
If you want to add jar dependencies without using wildcards, you can see my answer, which adds all the jars from a folder dynamically to the path you provide:
Spark spark-submit --jars arguments wants comma list, how to declare a directory of jars?
You should have the above-mentioned ignite-spark jar in this folder: /home/user/apache-ignite-2.7.0-bin/libs/optional/ignite-spark/*. Use the above approach and add the jars by folder.
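The folder-to-comma-list idea could be sketched like this (jarsInDir is a hypothetical helper, not code from the linked answer; --jars expects a comma-separated list, unlike the colon-separated extraClassPath):

```scala
import java.io.File

// Hypothetical helper: collect every .jar directly under a folder and
// join the absolute paths with commas, the separator --jars expects.
def jarsInDir(dir: String): String =
  new File(dir)
    .listFiles()
    .filter(_.getName.endsWith(".jar"))
    .map(_.getAbsolutePath)
    .mkString(",")
```

You would then pass the result to spark-submit, e.g. --jars "$(...)", instead of a wildcard path.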
Source: stackoverflow.com