Accepted answer

I've just begun to learn Spark, and I wanted to run it in local mode. I ran into the same problem as you: Failed to bind to: / Service 'sparkDriver' failed after 16 retries!

Because I only wanted to run Spark in local mode, the following solved it for me: edit the configuration file (you can find it in $SPARK_HOME/conf/) and add this to it:
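The snippet itself did not survive here. The setting usually cited for this "sparkDriver failed after 16 retries" error is pinning the driver to the loopback address; the file name (spark-env.sh) and value below are my assumption based on the error message, not preserved from the original answer:

```shell
# $SPARK_HOME/conf/spark-env.sh (assumed file) -- force the Spark driver to
# bind to loopback so local mode does not try, and fail, to bind to a
# non-routable hostname or IP.
export SPARK_LOCAL_IP="127.0.0.1"
```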


After that, my Spark worked fine in local mode. I hope this helps! :)


It might be an ownership issue as well:

hadoop fs -chown -R deepdive:root /user/deepdive/


The above solution did not work for me. Instead, I followed the steps in How to start Spark applications on Windows (aka Why Spark fails with NullPointerException)? and changed the HADOOP_HOME environment variable under the system variables. That worked for me.
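For the Windows case, the steps in that linked question boil down to providing winutils.exe under a Hadoop home directory and pointing HADOOP_HOME at it. A minimal sketch in cmd.exe syntax, assuming winutils.exe has already been downloaded to C:\hadoop\bin (the path is illustrative, not from the original answer):

```shell
:: Assumed layout: C:\hadoop\bin\winutils.exe already in place.
:: Persist HADOOP_HOME as an environment variable (takes effect in new
:: shells), then add its bin directory to PATH for the current session.
setx HADOOP_HOME "C:\hadoop"
set "PATH=%PATH%;C:\hadoop\bin"
```

Restart the shell (or IDE) afterwards so the new variable is picked up.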
