Spark exception: java.lang.ClassNotFoundException: de.unkrig.jdisasm.Disassembler

Asked: 2017-03-07 05:30:49

Tags: scala apache-spark

I am running Spark version 2.1.0 and I am getting the exception below. I still get the result, but the exception is thrown anyway:

java.lang.ClassNotFoundException: de.unkrig.jdisasm.Disassembler
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.codehaus.janino.SimpleCompiler.disassembleToStdout(SimpleCompiler.java:430)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:404)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:311)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:229)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:196)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:91)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:935)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:998)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:995)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:890)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:357)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
    at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:225)
    at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:272)
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2371)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2370)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2377)
    at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2405)
    at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2404)
    at org.apache.spark.sql.Dataset.withCallback(Dataset.scala:2778)
    at org.apache.spark.sql.Dataset.count(Dataset.scala:2404)

I am simply reading a text file and printing its count:

val rdd = sparkLocal.read.text("/data/logs/file.log")
println(rdd.count)

But I do get the result.
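
The Janino SimpleCompiler frames in the trace point at Spark SQL's whole-stage code generation. One way to check whether that path is the culprit is to switch whole-stage codegen off and re-run the count. This is only a minimal diagnostic sketch, assuming a local session, not a recommended fix:

// Sketch (assumption: the ClassNotFoundException originates in the
// whole-stage codegen path shown in the trace). If that holds, the
// exception should no longer appear with codegen disabled.
import org.apache.spark.sql.SparkSession

val sparkLocal = SparkSession.builder()
  .appName("log-count")
  .master("local[*]")
  .config("spark.sql.codegen.wholeStage", "false")
  .getOrCreate()

val df = sparkLocal.read.text("/data/logs/file.log")
println(df.count)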

Here is my build.sbt:

libraryDependencies ++= {
  val akkaVersion = "2.4.10"
  val sparkVersion = "2.1.0"

  Seq(
    "com.typesafe.akka" %% "akka-actor"                           % akkaVersion,
    "org.apache.spark"  %%  "spark-core"                          % sparkVersion,
    "org.apache.spark"  %%  "spark-sql"                           % sparkVersion,
    "org.apache.spark"  %%  "spark-hive"                          % sparkVersion,
    "com.typesafe.akka" %% "akka-slf4j"                           % akkaVersion,
    "org.apache.spark"  %% "spark-streaming"                      % sparkVersion
  )
}

Can anyone please help me?

2 Answers:

Answer 0 (score: 6):

I ran into the same problem; it appears to be caused by this issue of the Janino compiler.

The issue has already been closed and, according to the change log, the fix should be included in version 3.0.7, released on 2017-03-22, which is already on Maven.

I added an explicit

libraryDependencies += "org.codehaus.janino" % "janino" % "3.0.7"

to my build.sbt, and so far it seems to have resolved the problem.
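
If the Janino version pulled in transitively by Spark ever wins dependency resolution over the explicit one, it may also help to pin both Janino artifacts, since janino and commons-compiler are usually kept at matching versions. A minimal sketch for build.sbt, assuming sbt 0.13 (where dependencyOverrides is a Set; in sbt 1.x it is a Seq):

// Sketch: force both Janino artifacts to the patched release.
// Assumption: janino and commons-compiler should stay at the same version.
dependencyOverrides ++= Set(
  "org.codehaus.janino" % "janino"           % "3.0.7",
  "org.codehaus.janino" % "commons-compiler" % "3.0.7"
)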

Answer 1 (score: -1):

val rdd = sc.textFile("/data/logs/file.log")
println(rdd.count)
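
This sidesteps the error because sc.textFile returns a plain RDD[String], so counting it never goes through the Dataset/Catalyst code generation that calls into Janino. A self-contained sketch (the SparkSession setup below is assumed, not part of the original answer):

// Sketch, assuming a local session; sc.textFile yields an RDD[String],
// so the count bypasses Spark SQL code generation entirely.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("rdd-count")
  .master("local[*]")
  .getOrCreate()
val sc = spark.sparkContext

val rdd = sc.textFile("/data/logs/file.log")
println(rdd.count())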

For more usage examples, see the link below:

https://databricks.gitbooks.io/databricks-spark-reference-applications/logs_analyzer/chapter1/spark.html