Hello, I am running a SparkR program through a shell script. When I point the input file to a local path it works fine, but when I point it to HDFS it throws the following error:
Exception in thread "delete Spark local dirs" java.lang.NullPointerException
at org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:161)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:141)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1617)
at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:139)
Any help would be appreciated.
Answer 0 (score: 0)
I ran into the same problem with a Scala script. The issue was with the master URL, so I removed the code that sets it.
Before:
val conf = new org.apache.spark.SparkConf().setMaster(masterURL).set("spark.ui.port",port).setAppName("TestScalaApp")
Fixed code:
val conf = new org.apache.spark.SparkConf().setAppName("TestScalaApp")
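With `setMaster` removed from the code, the master URL has to be supplied at submit time instead. A minimal sketch of what that might look like (the master value, port, script name, and HDFS path below are placeholders for illustration, not taken from the original post):

```shell
# Hypothetical spark-submit invocation: the master URL and UI port are
# passed on the command line rather than hard-coded in SparkConf, so the
# same script works whether it reads local files or HDFS paths.
spark-submit \
  --master yarn \
  --conf spark.ui.port=4040 \
  my_sparkr_script.R hdfs:///path/to/input
```

Hard-coding a master URL in `SparkConf` can conflict with the one your launcher script or cluster environment passes to `spark-submit`, which is a common source of confusing failures such as the `NullPointerException` above when switching between local and HDFS inputs.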