Spark 2.3.0 error - org.apache.spark.SparkException: Exception thrown in awaitResult

Date: 2019-02-07 13:27:49

Tags: apache-spark

When creating the Spark context for a Spark batch job on the cluster with Spark 2.3.0, I get the exception below.

I am submitting the job via spark-submit.
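
A minimal sketch of the kind of spark-submit invocation involved (the main class is taken from the stack trace below; the jar path is assumed from the build.sbt name and version, and the YARN cluster-mode flags are assumptions, since the exact command is not shown):

spark-submit \
  --class com.paypal.xoom.sar.incremental.IncrementalDriver \
  --master yarn \
  --deploy-mode cluster \
  target/scala-2.11/xoomprofilescan_2.11-1.0.jar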

I compiled the code with Scala 2.11.9 and Spark 2.3.0 and built the jar. This is my build.sbt:

name := "XoomProfileScan"

version := "1.0"

scalaVersion := "2.11.9"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.0" % "provided"
libraryDependencies += "com.databricks" %% "spark-csv" % "1.2.0"
libraryDependencies += "com.typesafe" % "config" % "1.3.3"

The error occurs while creating the Spark context; this is how I create it:

val conf = new SparkConf().setAppName("XOOM_profile_SAR_filing")
val sc = new SparkContext(conf)
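
For reference, the usual Spark 2.x entry point is a SparkSession, which wraps the SparkContext; a minimal sketch with the same application name (shown only as the 2.x idiom for comparison, not the code that fails):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("XOOM_profile_SAR_filing")
  .enableHiveSupport() // assumes Hive support is wanted, since spark-hive is in build.sbt
  .getOrCreate()
val sc = spark.sparkContext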

Below is the full error log. When I compiled and ran it with Spark 1.3.0 and Scala 2.10, it worked fine.

19/02/06 23:23:05 ERROR ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:815)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
    at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:814)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:839)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.ExecutionException: Boxed Error
    at scala.concurrent.impl.Promise$.resolver(Promise.scala:55)
    at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$$resolveTry(Promise.scala:47)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:244)
    at scala.concurrent.Promise$class.tryFailure(Promise.scala:112)
    at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:153)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:739)
Caused by: java.lang.ExceptionInInitializerError
    at com.paypal.xoom.sar.incremental.IncrementalDriver.main(IncrementalDriver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: hash
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:235)
    at org.apache.spark.SparkEnv$.instantiateClass$1(SparkEnv.scala:260)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:324)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:176)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
    at com.paypal.xoom.sar.incremental.IncrementalDriver$.<init>(IncrementalDriver.scala:24)
    at com.paypal.xoom.sar.incremental.IncrementalDriver$.<clinit>(IncrementalDriver.scala)
    ... 6 more

0 Answers:

There are no answers yet.