score:0

I am answering this question for sbt configurations. I ran into the same issue recently and made some basic mistakes along the way, which I would like you to note:

1. Configure your sbt file

Go to your build.sbt file and check that the Scala version you are using is compatible with Spark. As per Spark 2.4.0 (https://spark.apache.org/docs/latest/), the required Scala version is 2.11.x, not 2.12.x. So even though your IDE (Eclipse/IntelliJ) shows the latest version of Scala, or the version you downloaded, change it to a compatible one. Also, include this line of code:

libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.6"

where 2.11.x is your Scala version.
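
For reference, a minimal build.sbt for a Spark 2.4.0 project might look like the following sketch. The project name is a placeholder, and the spark-sql dependency is an assumption based on the question; it is marked "provided" because spark-submit supplies Spark at runtime.

name := "json-reader" // hypothetical project name
version := "0.1"
scalaVersion := "2.11.12" // any 2.11.x release works with Spark 2.4.0

libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.12"
// assumed dependency; "provided" keeps Spark out of the packaged JAR
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided"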

2. File hierarchy

Make sure your Scala file is under the src/main/scala directory only.
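
For example, for the class com.jsonreader.json mentioned in the question, the layout would be (the file name is an assumption):

<project-root>/
  build.sbt
  src/
    main/
      scala/
        com/
          jsonreader/
            json.scala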

3. Terminal

If your IDE lets you launch a terminal within it, do so (IntelliJ does; I am not sure about Eclipse or others), or open a terminal and change directory to your project directory.

Then run:

sbt clean

This clears any previously compiled classes and the folders created by earlier builds (the target directory).

sbt package

This packs your compiled files into a single JAR file under target/scala-<version>/.

Then submit to Spark:

spark-submit --class "<classname>" --master local[*] target/scala-<version>/<.jar file>

(In your case, <classname> is com.jsonreader.json.) The options must come before the application JAR; anything after the JAR is passed as arguments to your program.

Note that --master is not required here if it is already specified in your program (for example via SparkConf). Also, --jars is only for additional dependency JARs, not the application JAR itself.
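
For completeness, a minimal Spark application matching that class name might look like the sketch below (the input path people.json is a hypothetical example):

package com.jsonreader

import org.apache.spark.sql.SparkSession

object json {
  def main(args: Array[String]): Unit = {
    // master and deploy options come from spark-submit, not from the code
    val spark = SparkSession.builder().appName("json-reader").getOrCreate()

    val df = spark.read.json("people.json") // hypothetical input file
    df.show()

    spark.stop()
  }
}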

score:1

Hard to say, but maybe you have many classes that qualify as a main class, so the build tool does not know which one to choose. Maybe try cleaning the project first with sbt clean.

Anyway, in Scala the preferred way to define a main class is to extend the App trait:

object SomeApp extends App

Then the whole object body becomes your main method. You can also define the main class in your build.sbt; this is necessary if you have many objects that extend the App trait:

mainClass in (Compile, run) := Some("io.example.SomeApp")
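
As a sketch, assuming two hypothetical entry points in the same project, the setting above picks which one sbt run starts:

package io.example

object SomeApp extends App {
  // the whole object body runs as the main method
  println("Hello from SomeApp")
}

object OtherApp extends App {
  println("Hello from OtherApp")
}

With mainClass set as above, sbt run launches SomeApp directly; without it, sbt prompts you to choose between the two main classes.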
