
This error means that the following resource is missing from your classpath:

META-INF/services/org.apache.spark.sql.sources.DataSourceRegister

It should be part of the ignite-spark dependency.
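A quick sanity check is to look inside the jar for that service file. A minimal sketch, assuming `unzip` is available (the jar path in the usage comment is only an example, and the helper name `has_service` is mine):

```shell
# has_service JAR: exit 0 if JAR contains the DataSourceRegister
# service file, non-zero otherwise. `unzip -l` lists the archive
# entries without extracting them.
has_service() {
  unzip -l "$1" | grep -q "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister"
}

# Example usage (path is illustrative):
# has_service /path/to/ignite-spark-2.7.0.jar && echo "service file present"
```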

So here is what you should check:

1) That ignite-spark-2.7.0.jar exists on the classpath of every node where Spark runs.

2) If you use spark.driver.extraClassPath, then check that:

a. You run in client mode (--deploy-mode client), because Spark fires up a Netty HTTP server that distributes the files to each of the worker nodes on startup. In cluster mode, Spark selects a worker node to execute the driver process on, which means the job isn't running directly from the master node.

b. I am not sure, but it looks like extraClassPath requires a list of jar files instead of /path/to/lib/*. You can try the following:

executor_path=""
for eachjarinlib in $jars ; do
  if [ "$eachjarinlib" != "applicationjartobeaddedseperately.jar" ]; then
    executor_path=file:$eachjarinlib:$executor_path
  fi
done
spark-submit --deploy-mode client --master yarn --conf "spark.driver.extraClassPath=$executor_path" --class $example_class $path_to_jar

where $jars is the path to your libs.
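The loop above can also be wrapped in a small function so it is reusable across jobs; this is just a sketch of the same idea (the function name `classpath_from_dir` is mine):

```shell
# classpath_from_dir DIR APPJAR: build a colon-separated
# spark.driver.extraClassPath value from every jar in DIR,
# skipping APPJAR, which is passed to spark-submit separately.
classpath_from_dir() {
  local dir="$1" appjar="$2" cp="" j
  for j in "$dir"/*.jar; do
    if [ "$(basename "$j")" != "$appjar" ]; then
      cp="file:$j${cp:+:$cp}"   # prepend entry, separating with ':'
    fi
  done
  printf '%s\n' "$cp"
}

# Example (paths illustrative):
# spark-submit --deploy-mode client --master yarn \
#   --conf "spark.driver.extraClassPath=$(classpath_from_dir /path/to/lib app.jar)" \
#   --class $example_class $path_to_jar
```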

---

Update: as mentioned in the Ignite deployment docs, you should also set the executor classpath along with the driver classpath:

spark.executor.extraClassPath /opt/ignite/libs/*:/opt/ignite/libs/optional/ignite-spark/*:/opt/ignite/libs/optional/ignite-log4j/*:/opt/ignite/libs/optional/ignite-yarn/*:/opt/ignite/libs/ignite-spring/*

I think this is the real issue.
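Put together, the relevant part of spark-defaults.conf would look roughly like this (the wildcard entries and exact folder layout are assumptions based on a default /opt/ignite install; adjust to yours):

```
spark.driver.extraClassPath   /opt/ignite/libs/*:/opt/ignite/libs/optional/ignite-spark/*:/opt/ignite/libs/optional/ignite-log4j/*:/opt/ignite/libs/optional/ignite-yarn/*:/opt/ignite/libs/ignite-spring/*
spark.executor.extraClassPath /opt/ignite/libs/*:/opt/ignite/libs/optional/ignite-spark/*:/opt/ignite/libs/optional/ignite-log4j/*:/opt/ignite/libs/optional/ignite-yarn/*:/opt/ignite/libs/ignite-spring/*
```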


http://apache-ignite-users.70518.x6.nabble.com/spark-ignite-connection-using-config-file-td21827.html

It seems you have to use a lower version of Ignite.

For Ignite 2.6:

<dependency>
    <groupId>org.apache.ignite</groupId>
    <artifactId>ignite-spark</artifactId>
    <version>2.6.0</version>
</dependency>

You can see its Spark dependency (source):

  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.3.0</version>
      <scope>compile</scope>
  </dependency>

Also see:
1) IGNITE-8534, which was fixed in Ignite 2.6
2) the discussion "Upgrade ignite-spark module's Spark version to 2.3.0"

Call the function below in your driver; it prints all the classpath entries so you can debug which jars are on your classpath. The ignite-spark jar should be present there at runtime.

The caller would be:

urlsinclasspath(getClass.getClassLoader).foreach(println)


def urlsinclasspath(cl: ClassLoader): Array[java.net.URL] = cl match {
  case null => Array()
  case u: java.net.URLClassLoader => u.getURLs() ++ urlsinclasspath(cl.getParent)
  case _ => urlsinclasspath(cl.getParent)
}

If you want to add jar dependencies without using wildcards, you can see my answer, which adds all the jars from a folder dynamically to the path you provide:

spark spark-submit --jars arguments wants comma list, how to declare a directory of jars?

You should have the above-mentioned ignite-spark jar in this folder: /home/user/apache-ignite-2.7.0-bin/libs/optional/ignite-spark/*. Use the above approach to add the jars by folder.
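That folder-based idea can be sketched as a small helper that emits the comma-separated list --jars expects (the function name is mine; the Ignite path in the usage comment comes from the question):

```shell
# jars_csv DIR: print all jars in DIR joined with commas, the
# format spark-submit's --jars option expects.
jars_csv() {
  local list="" j
  for j in "$1"/*.jar; do
    list="${list:+$list,}$j"   # append entry, separating with ','
  done
  printf '%s\n' "$list"
}

# Example usage:
# spark-submit --jars "$(jars_csv /home/user/apache-ignite-2.7.0-bin/libs/optional/ignite-spark)" ...
```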

