I am using emr-5.4.0 with Spark 2.1.0. I understand what a NullPointerException is; this question is about why one was thrown in this particular case.
I cannot figure out why I am getting a NullPointerException in the driver thread.
The job fails with this strange error:
18/03/29 20:07:52 INFO ApplicationMaster: Starting the user application in a separate Thread
18/03/29 20:07:52 INFO ApplicationMaster: Waiting for spark context initialization...
Exception in thread "Driver" java.lang.NullPointerException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:637)
18/03/29 20:07:52 ERROR ApplicationMaster: Uncaught exception:
java.lang.IllegalStateException: SparkContext is null but app is still running!
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:415)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:766)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:764)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
18/03/29 20:07:52 INFO ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.lang.IllegalStateException: SparkContext is null but app is still running!)
18/03/29 20:07:52 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Uncaught exception: java.lang.IllegalStateException: SparkContext is null but app is still running!)
18/03/29 20:07:52 INFO ApplicationMaster: Deleting staging directory hdfs://<ip-address>.ec2.internal:8020/user/hadoop/.sparkStaging/application_1522348295743_0010
18/03/29 20:07:52 INFO ShutdownHookManager: Shutdown hook called
End of LogType:stderr
I submitted the job like this:
spark-submit --deploy-mode cluster --master yarn --num-executors 40 --executor-cores 16 --executor-memory 100g --driver-cores 8 --driver-memory 100g --class <package.class_name> --jars <s3://s3_path/some_lib.jar> <s3://s3_path/class.jar>
My class looks like this:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

class MyClass {
  def main(args: Array[String]): Unit = {
    val c = new MyClass()
    c.process()
  }

  def process(): Unit = {
    val sparkConf = new SparkConf().setAppName("my-test")
    val sparkSession: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
    import sparkSession.implicits._
    ....
  }

  ...
}
Answer (score: 6)
Change class MyClass to object MyClass and you're done.

While we're at it, I'd also change class MyClass to object MyClass extends App and remove def main(args: Array[String]): Unit (which extends App provides).
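For reference, a minimal sketch of what the fixed application could look like; folding the process() body directly into the object is my own simplification, and the body itself is abbreviated as in the question:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object MyClass extends App {
  // extends App supplies the main method, so the object body is the entry point
  val sparkConf = new SparkConf().setAppName("my-test")
  val sparkSession: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
  import sparkSession.implicits._
  // ... rest of the original process() logic ...
}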
I've reported an improvement for Spark 2.3.0 ([SPARK-23830] Spark on YARN in cluster deploy mode fail with NullPointerException when a Spark application is a Scala class not object) to have this reported nicely to the end user.
Digging into how Spark on YARN works, the following message is what the ApplicationMaster of a Spark application prints when it starts the driver (which happens when you spark-submit with --deploy-mode cluster --master yarn):
ApplicationMaster: Starting the user application in a separate Thread
Right after that INFO message you should see another one:
ApplicationMaster: Waiting for spark context initialization...
This is part of driver initialization when the ApplicationMaster runs.

The cause of the exception Exception in thread "Driver" java.lang.NullPointerException is the following code:
val mainMethod = userClassLoader.loadClass(args.userClass)
  .getMethod("main", classOf[Array[String]])
My understanding is that because MyClass is a class, not an object, there is no static main method; getMethod still finds the public instance method, so the following line (which invokes mainMethod with a null receiver) "triggers" the NullPointerException:
mainMethod.invoke(null, userArgs.toArray)
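You can reproduce the same failure outside Spark. The following is my own standalone illustration, not Spark source; the names AsClass, AsObject, and ReflectionNpeDemo are made up, and it assumes the default package so Class.forName can resolve them. It shows that reflectively invoking the instance main of a plain class with a null receiver throws a NullPointerException, while the static forwarder the Scala compiler generates for a top-level object's main accepts one:

class AsClass {
  def main(args: Array[String]): Unit = println("instance main")
}

object AsObject {
  def main(args: Array[String]): Unit = println("static main")
}

object ReflectionNpeDemo {
  def main(args: Array[String]): Unit = {
    // A top-level Scala object gets a static forwarder for main,
    // so a null receiver is fine (this is what spark-submit relies on).
    val ok = Class.forName("AsObject").getMethod("main", classOf[Array[String]])
    ok.invoke(null, Array.empty[String])   // prints "static main"

    // getMethod still finds the instance method on a plain class,
    // but invoking it with a null receiver throws NullPointerException,
    // exactly like mainMethod.invoke(null, userArgs.toArray) above.
    val bad = Class.forName("AsClass").getMethod("main", classOf[Array[String]])
    bad.invoke(null, Array.empty[String])  // java.lang.NullPointerException
  }
}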
The thread is indeed called Driver (as in Exception in thread "Driver" java.lang.NullPointerException), and its name is set in this line:
userThread.setContextClassLoader(userClassLoader)
userThread.setName("Driver")
userThread.start()
The line numbers differ because I was referencing the Spark 2.3.0 sources while you use emr-5.4.0 with Spark 2.1.0.