I have finally been able to figure this one out.
The root cause was that an older version of hadoop-core was being pulled in (1.2.1 instead of 2.6.5), which in fact does not have the Configuration.getPassword() method. I found this out after setting up a test project in which the SparkContext initialized correctly, and then comparing the source jars of the two Configuration classes in the two projects (using Configuration.class.getProtectionDomain().getCodeSource().getLocation().getPath()).
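For anyone who wants to run the same check, here is a minimal sketch (the object name is mine, and the import works because org.apache.hadoop.conf.Configuration lives in the same package in both hadoop-core 1.2.1 and hadoop-common 2.x) that prints which jar the Configuration class was actually loaded from:

```scala
import org.apache.hadoop.conf.Configuration

// Prints the jar that Configuration was actually loaded from.
// Seeing hadoop-core-1.2.1.jar here instead of a 2.6.5 jar is the
// smoking gun for this NoSuchMethodError.
object WhichJar {
  def main(args: Array[String]): Unit = {
    val path = classOf[Configuration]
      .getProtectionDomain
      .getCodeSource
      .getLocation
      .getPath
    println(s"Configuration loaded from: $path")
  }
}
```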
After forcing version 2.6.5 using Maven's dependency management, and after manually deleting the older 1.2.1 jar from the local Maven repository, it worked fine.
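The override itself is just a pinned version in dependencyManagement; roughly like the sketch below (I show hadoop-common as the artifactId, since that is where Configuration lives in Hadoop 2.x; adjust it to whatever your tree actually pulls in):

```xml
<!-- Sketch: pin the Hadoop version so transitive resolution
     cannot fall back to the old 1.2.1 artifact. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```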
The only thing that I still don't understand is why hadoop-core was not showing up in the Maven dependency tree. Otherwise, I would (probably) have found this sooner.
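For what it's worth, the dependency plugin's verbose mode can reveal versions that the default tree omits after conflict resolution, which might have exposed the stray 1.2.1 jar (double-check the flags against your plugin version):

```
mvn dependency:tree -Dverbose -Dincludes=org.apache.hadoop
```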