score:4
All an import statement does is bring a name into scope so you can refer to it by its unqualified name. That is, importing DataFrame allows you to write just DataFrame instead of having to write its full name, org.apache.spark.sql.DataFrame.
So if you're okay with writing the full name, or you're not writing the name at all, you don't need the import. In your first code the latter is the case.
PS: You don't need a return statement at the end of your method. The last expression in a method is returned automatically.
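Both points can be shown in a short, runnable sketch. Since org.apache.spark.sql.DataFrame needs Spark on the classpath, this uses scala.collection.immutable.ListMap as a stand-in type; the mechanics of qualified names vs. imports are the same.

```scala
// No import needed: the fully qualified name is written out.
def qualified(xs: Seq[Int]): scala.collection.immutable.ListMap[Int, Int] =
  scala.collection.immutable.ListMap(xs.map(x => x -> x * x): _*)

// After an import, the short name suffices.
import scala.collection.immutable.ListMap

def short(xs: Seq[Int]): ListMap[Int, Int] =
  ListMap(xs.map(x => x -> x * x): _*) // last expression is the return value; no `return` keyword
```

Both methods compile to the same thing; the import only changes how you spell the type.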
score:1
def foo(path: String): org.apache.spark.sql.DataFrame = {
  val df = ...
  df
}
will work without an import. You need an import only if you want to use the short name.
Source: stackoverflow.com