Zeppelin: running a notebook

Date: 2016-09-21 06:28:01

Tags: scala apache-spark gradle apache-kafka apache-zeppelin

I provide my jar as a dependency of Zeppelin's Spark interpreter. When I run a notebook that invokes services from this jar, the notebook stays in an error state, and the Zeppelin log shows the following exception:

Caused by: java.lang.NoClassDefFoundError: scala/reflect/internal/AnnotationInfos$ErroneousAnnotation$
    at scala.tools.nsc.interpreter.ReplGlobal$$anon$1.newTyper(ReplGlobal.scala:34)
    at scala.tools.nsc.typechecker.Namers$Namer.<init>(Namers.scala:58)
    at scala.tools.nsc.typechecker.Namers$NormalNamer.<init>(Namers.scala:50)
    at scala.tools.nsc.typechecker.Namers$class.newNamer(Namers.scala:51)
    at scala.tools.nsc.interpreter.ReplGlobal$$anon$1.newNamer(ReplGlobal.scala:23)
    at scala.tools.nsc.typechecker.Analyzer$namerFactory$$anon$1.apply(Analyzer.scala:43)
    at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:430)
    at scala.tools.nsc.Global$GlobalPhase$$anonfun$run$1.apply(Global.scala:397)
    at scala.tools.nsc.Global$GlobalPhase$$anonfun$run$1.apply(Global.scala:397)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.tools.nsc.Global$GlobalPhase.run(Global.scala:397)
    at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1625)
    at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1610)
    at scala.tools.nsc.Global$Run.compileSources(Global.scala:1605)
    at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:388)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:804)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:763)
    at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:939)
    at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:934)
    at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:531)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:519)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:517)
    ... 18 more

To give a quick overview of the jar: it basically reads from and writes to Kafka using Spark Streaming.

The code is written in Scala, and I am building a fat jar with Gradle. Strangely, when I build the jar with SBT (using the assembly plugin), it works fine. Here is my Gradle file:

plugins {
    id "com.github.johnrengelman.shadow" version "1.2.3"
}
group 'com.demo'
version '1.0-SNAPSHOT'
apply plugin: 'java'
apply plugin: 'scala'
sourceCompatibility = 1.8

// compile-time-only dependencies that should not be packed into the fat jar
configurations { providedCompile }

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    providedCompile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.0'
    providedCompile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.0.0'
    compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'
    compile group: 'org.apache.spark', name: 'spark-streaming-kafka_2.11', version: '1.6.2'
    compile group: 'org.apache.spark', name: 'spark-catalyst_2.11', version: '2.0.0'
    compile group: 'org.apache.kafka', name: 'kafka_2.11', version: '0.9.0.1'
    compile group: 'org.apache.kafka', name: 'kafka-clients', version: '0.9.0.1'
    testCompile group: 'junit', name: 'junit', version: '4.11'
}

shadowJar {
    zip64 true
}

build.dependsOn(shadowJar)

// make the providedCompile dependencies visible on the compile and test classpaths
sourceSets.main.compileClasspath += configurations.providedCompile
sourceSets.test.compileClasspath += configurations.providedCompile
sourceSets.test.runtimeClasspath += configurations.providedCompile
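
One thing worth noting about this setup: shadowJar bundles the runtime configuration, so spark-streaming, spark-catalyst, and their transitive Scala runtime (scala-library, scala-reflect) end up inside the fat jar, while the providedCompile dependencies stay out. An sbt-assembly build that marks the Spark modules as "provided" would not bundle them. A minimal sketch of the Gradle-side equivalent, assuming the Spark modules should come from the cluster/interpreter rather than from the jar:

dependencies {
    // assumption: treat the remaining Spark modules as provided, as an sbt-assembly
    // build typically would, so their transitive scala-library/scala-reflect jars
    // stay out of the fat jar
    providedCompile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'
    providedCompile group: 'org.apache.spark', name: 'spark-catalyst_2.11', version: '2.0.0'
}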

1 Answer:

Answer 0 (score: 1)

This is a dependency conflict with the Spark interpreter. Adding the following to your exclusions may resolve your problem:

org.scala-lang:scala-library,org.scala-lang:scala-reflect
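
In the Zeppelin UI these would typically go into the "exclude" field of the interpreter's dependency entry (comma-separated groupId:artifactId). The same idea can be applied on the build side instead: below is a minimal, untested sketch, assuming the Shadow plugin's dependency-filter DSL, that keeps the Scala runtime out of the fat jar so only the interpreter's own copy is on the classpath:

shadowJar {
    zip64 true
    dependencies {
        // keep the Scala runtime out of the shadow jar; the Spark interpreter
        // already ships its own scala-library and scala-reflect
        exclude(dependency('org.scala-lang:scala-library:.*'))
        exclude(dependency('org.scala-lang:scala-reflect:.*'))
    }
}

Either way, the goal is the same: exactly one copy of scala-library and scala-reflect, the interpreter's own, should be visible at runtime.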