I am trying to submit an application to a remote Spark 2.3.2 cluster (which I can reach from the client machine). I keep getting ERROR SparkContext:91 - Error initializing SparkContext. java.lang.NullPointerException, but I don't understand why. I have been able to run this code locally and it works fine.
Any insight or pointers as to what ERROR SparkContext:91 - Error initializing SparkContext. java.lang.NullPointerException might mean would be appreciated.
spark-submit --class com.mycode.example.Counter --master spark://my-remote-spark-master:7077 ./target/scala-2.11/example-hcl-spark-scala-mycodeblock.jar /users/mycode/input /users/mycode/output
...
2018-11-14 15:15:11 ERROR SparkContext:91 - Error initializing SparkContext.
java.lang.NullPointerException
at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:64)
at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:241)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
at com.mycode.example.Counter$.main(Counter.scala:27)
at com.mycode.example.Counter.main(Counter.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-11-14 15:15:11 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-11-14 15:15:11 INFO SparkContext:54 - SparkContext already stopped.
Exception in thread "main" java.lang.NullPointerException
at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:64)
at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:241)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
at com.mycode.example.Counter$.main(Counter.scala:27)
at com.mycode.example.Counter.main(Counter.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
2018-11-14 15:15:11 INFO SparkContext:54 - Successfully stopped SparkContext
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-11-14 15:15:11 INFO ShutdownHookManager:54 - Shutdown hook called
2018-11-14 15:15:11 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/0q/8hp24_5n78q2np59_lpm10fh0000gn/T/spark-91cc1f38-cb03-489f-90cb-2d65763dc0b9
2018-11-14 15:15:11 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/0q/8hp24_5n78q2np59_lpm10fh0000gn/T/spark-720b55a1-4723-42ac-87d4-befc06b0f2e4
Answer 0 (score: 0)
In my case, the problem was resolved by creating an HDFS / MapR-FS home directory for the user.
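On a plain HDFS cluster this can be done roughly as follows, run as the HDFS superuser; &lt;username&gt; is a placeholder (not a value from the original post), and on MapR-FS the equivalent hadoop fs commands apply:

# create the user's home directory and hand ownership to that user
hdfs dfs -mkdir -p /user/&lt;username&gt;
hdfs dfs -chown &lt;username&gt; /user/&lt;username&gt;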